WO2007015184A2 - Apparatus and method for automatically determining privacy settings for content - Google Patents
Apparatus and method for automatically determining privacy settings for content
- Publication number
- WO2007015184A2 (PCT/IB2006/052482)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- content
- privacy
- privacy setting
- rules
- recommended
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6209—Protecting access to data via a platform, e.g. using keys or access control rules to a single file or object, e.g. in a secure envelope, encrypted and accessed using a key, or with access control rules appended to the object itself
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/21—Indexing scheme relating to G06F21/00 and subgroups addressing additional information or applications relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/2101—Auditing as a secondary aspect
Definitions
- the invention relates to the automatic recommendation of privacy settings for content, such as audio-video content, photographs, and other documents. Particularly, aspects of the invention pertain to making recommendations based upon attributes of the audio-video content and previous recommendations.
- a method is known for sharing based on a face in an image, the method comprising the steps of defining a sharing rule based on face-identifying information, and applying the face-identifying information associated with the image to the sharing rule to determine the one or more recipients with which the image should be shared.
- Such a method does provide users with an automated process for sharing imported pictures, but it has proved difficult to manage and maintain because the rules are static in nature and require continuous maintenance. This is especially so for the sharing and privacy of personal audio-video content, where many variables lead to complicated rules.
- Privacy settings may be defined which describe what actions should be performed on content, taking into account privacy aspects; for example, the recommended actions could comprise backup, archiving, printing, deleting, duplicating or organizing the content. This has an analogy with the filtering of unwanted email, often termed spam: early email programs introduced rule-based filtering, which all too soon became unmanageable for the average user. Returning once more to the context of photographs, an example will illustrate the difficulties in managing a set of pre-determined rules. Suppose that an apparatus has rule 1 that says: 1. If (photo includes Bob) then share with Alice, where Alice is Bob's wife. Then a new photo is taken that includes Bob and Brenda, where Brenda has a secret relationship with Bob. The static rule 1 would share this photo with Alice as well, which is clearly not the intended outcome.
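The difficulty can be made concrete with a small sketch. The function below is a hypothetical rendering of rule 1 as a static, hard-coded predicate; the function name and the detected-people input are illustrative assumptions, not part of the described apparatus.

```python
# Hypothetical static sharing rule: hard-coded, so it cannot adapt.
def rule_1(people_in_photo):
    """Rule 1: if the photo includes Bob, share it with Alice."""
    return ["Alice"] if "Bob" in people_in_photo else []

# The new photo containing Bob and Brenda still triggers the rule:
print(rule_1({"Bob", "Brenda"}))  # -> ['Alice']
```

A static rule base keeps sharing such photos with Alice until a user notices and edits the rule by hand, which is exactly the maintenance burden described above.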
- an apparatus for automatically determining privacy settings for content comprising: a content source for content; an intrinsic content analyser unit, communicatively coupled to the content source, the intrinsic content analyser unit being arranged to analyse and extract intrinsic information from the content; a privacy rules base comprising privacy setting rules; and a privacy engine, communicatively coupled to the intrinsic content analyser unit and the privacy rules base, the privacy engine being arranged to determine a recommended privacy setting for the content based upon the privacy setting rules and the intrinsic information extracted from the content; wherein the apparatus further comprises a privacy rules update unit, communicatively coupled to the intrinsic content analyser unit and to the privacy engine and to the privacy rules base, the privacy rules update unit being arranged to update the privacy setting rules based upon the recommended privacy setting for the content and the intrinsic information extracted from the content. In this way the apparatus learns from past decisions and improves the recommendations.
- the measure as defined in claim 2 has the advantage that the privacy setting rules are updated taking into account the complete history of recommended privacy settings determined by the privacy engine and the complete history of intrinsic information, thereby improving the quality of recommendations.
- the means defined in claim 3 allow a user to provide feedback to the apparatus on recommended privacy settings, so that the apparatus can further improve the quality of the recommendations within a limited number of iterations, taking into account subjective aspects of value to the user.
- a decision tree learning process to create a decision tree as basis for recommending and updating the privacy setting rules provides a simple, yet efficient, learning algorithm for improving the quality of recommendations based upon observations and previous decisions.
- a decision tree can be induced upon command from any set of historical information, allowing a previous decision tree to be recovered, or a new decision tree, or set thereof, to be created.
- the measures of claim 6 provide an apparatus that stores content taking into account the recommended privacy settings thereby allowing the future retrieval of the content.
- a secure storage unit may be applied ensuring that the recommended privacy settings are enforced.
- the measures defined in claim 8 allow users to present themselves to an apparatus and thereby store and retrieve content in a secure manner taking into account recommended privacy settings.
- Advantageously recommended privacy settings are determined based upon a wide range of intrinsic information as defined in claim 9 enabling the privacy engine and the privacy rules update unit to determine the most suitable privacy settings rules.
- the object is realized by a method as defined in claim 10. Further advantageous measures are defined in claims 11 through 15.
- a third aspect of the present invention provides a computer-readable recording medium containing a program to realize the object of the invention as defined in claim 16. According to a fourth aspect of the present invention the object is realized by providing a program for controlling an information processing apparatus as claimed in claim 17.
- FIG. 1 is a schematic diagram of a first embodiment of the present invention
- FIG. 2 is a table containing learning examples and a resulting recommended privacy setting
- FIG. 3 is a diagram showing the decision tree induced from the learning examples of FIG. 2;
- FIG. 4 is a flowchart diagram illustrating method steps performed in a first embodiment of the invention
- FIG. 5 is a schematic diagram of a second embodiment of the present invention
- FIG. 6 is a flowchart diagram illustrating method steps performed in a second embodiment of the invention
- FIG. 7 is a schematic diagram of a third embodiment of the present invention
- FIG. 8 is a flowchart diagram illustrating method steps performed in a third embodiment of the invention
- FIG. 9 is a schematic diagram of a fourth embodiment of the present invention.
- FIG. 1 is a block diagram showing an apparatus, easy to manage, for automatically determining the privacy settings for content such as audio-video content, photographs and other documents.
- Audio-video content is sourced by, for example, a camera 9, such as a digital still camera or video camera, and imported by an import unit 2.
- the actual method of sourcing the content is not essential; audio-video content can be retrieved by any known means, such as from a fixed or removable storage medium, or from a network such as the Internet.
- An intrinsic content analyser 3 analyses the content and extracts metadata or attributes denoted as intrinsic information 10 from the content by known means.
- Extracting embedded metadata may be performed according to a known metadata standard such as, for example, the EXIF standard, the MPEG 7 standard, the MusicPhoto Video (MPV) standard or any other proprietary solution.
- Useful metadata may comprise the location of creation, by the use of Global Positioning System (GPS) information, creation time and date, resolution, focal length, etc.
- Content analysis techniques may also be used to extract inherent information contained within the audio-video content. For example, face detection and face recognition can be performed to identify people contained within the audio-video content. Objects and locations, for example indoor/outdoor classification, can be detected by similar content analysis procedures.
- Low level features can further indicate useful attributes of the audio-video content that may be suitable attributes upon which the privacy settings of the audio-video content may be determined.
- a privacy engine 4 accepts as input the intrinsic information 10 and privacy setting rules. The privacy engine 4 determines a recommended privacy setting 8 based upon the intrinsic information 10 and the privacy setting rules. The privacy engine 4 can make use, for example, of a decision tree, though other machine learning techniques could be used. Such a decision tree is known from the prior art of machine learning.
- the privacy setting rules are stored in a privacy rules base 5 which is in fact the current valid set of privacy setting rules upon which recommendations are made. In a preferred embodiment the privacy rules base 5 would be protected by encryption in a secure manner.
- the privacy rules base 5 may also store multiple sets of privacy setting rules, based upon multiple hypotheses, upon which recommendations are made and may further continuously evaluate each set during operation choosing the most suitable at the time a decision must be made.
- the privacy engine 4 may be termed a performance element since it makes decisions based upon the incoming attributes of the audio-video content and the current valid set of privacy setting rules and so determines the performance of the recommendation.
- the recommended privacy setting 8 may be used to immediately share, for example, via a network, or store, on a storage unit 7, the audio-video content.
- the recommended privacy setting 8 may be used to recommend other actions to be performed upon the audio-video content, such as, the backup, archiving, printing, deleting, duplicating or organizing.
- a privacy rules update unit 6 accepts as input the recommended privacy setting 8 and the intrinsic information 10 and adapts the privacy setting rules contained within the privacy rules base 5 in such a manner that the privacy setting rules are also consistent with the latest recommendation.
- the privacy rules update unit 6 is therefore a learning element that modifies the behaviour of the privacy engine 4 to improve the recommendations and therefore the performance.
- the privacy rules update unit 6 generates a simple hypothesis consistent with the information, in doing so subscribing to the principle of Ockham's razor.
- the hypothesis may be based upon a set of examples relating the recommended privacy setting 8 to the intrinsic information 10.
- the set of examples may be a pre-determined set of training data or be generated from the initial usage of the apparatus.
- the privacy rules update unit 6 therefore alleviates the need for a user to manually modify the privacy setting rules contained within the privacy rules base 5 and is therefore easier to manage. Furthermore, inconsistent or contradictory privacy setting rules are prevented further increasing the manageability of the privacy setting rules.
- FIG.2 provides an example set of learning examples or training data relating to the example as introduced earlier.
- the intrinsic information 10 or attributes are defined as Bob, Brenda and Colleagues.
- the recommended privacy setting 8 is defined as Share, with the resulting recommendation being Yes or No.
- the privacy rules update unit 6 determined the privacy setting rules that are shown graphically in FIG. 3.
- the decision tree is constructed by the privacy rules update unit 6 and may be based upon the amount of information, in the sense of Shannon and Weaver, provided at each node in the decision tree. Other methods known in the prior art could, of course, be used to create the decision tree, for example methods based on gain ratios or Gini indices.
- the information content is directly related to the probability of each outcome and is defined as I(P(v_1), ..., P(v_n)) = Σ_i -P(v_i) log2 P(v_i), where P(v_i) denotes the probability of outcome v_i occurring.
- the training set approximates the ground truth probabilities P(v_i), and the estimated information content is given by I(p/(p+n), n/(p+n)), where p indicates the total number of positive outcomes, n indicates the total number of negative outcomes, and p_i and n_i indicate the number of positive and negative outcomes respectively given attribute A.
- the information gained from a specific test upon an attribute A is defined as the information gain: Gain(A) = I(p/(p+n), n/(p+n)) - Remainder(A), where Remainder(A) = Σ_i (p_i+n_i)/(p+n) · I(p_i/(p_i+n_i), n_i/(p_i+n_i)).
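As a sketch of these formulas, the Python below computes I and Gain(A) for a small training set. The data is a hypothetical stand-in for FIG. 2, so the resulting numbers differ from those quoted in the text.

```python
import math

def information(p, n):
    """I(p/(p+n), n/(p+n)): information content, in bits, of a set
    with p positive and n negative outcomes."""
    if p == 0 or n == 0:
        return 0.0
    t = p + n
    return -(p / t) * math.log2(p / t) - (n / t) * math.log2(n / t)

def gain(examples, attribute):
    """Gain(A) = I(p/(p+n), n/(p+n)) - Remainder(A)."""
    p = sum(e["Share"] == "Yes" for e in examples)
    remainder = 0.0
    for value in ("Yes", "No"):
        subset = [e for e in examples if e[attribute] == value]
        if subset:
            sp = sum(e["Share"] == "Yes" for e in subset)
            remainder += len(subset) / len(examples) * information(sp, len(subset) - sp)
    return information(p, len(examples) - p) - remainder

# Hypothetical training data ("Share" is the recommended setting):
data = [
    {"Bob": "Yes", "Brenda": "Yes", "Colleagues": "No",  "Share": "No"},
    {"Bob": "No",  "Brenda": "Yes", "Colleagues": "No",  "Share": "No"},
    {"Bob": "Yes", "Brenda": "Yes", "Colleagues": "Yes", "Share": "Yes"},
    {"Bob": "Yes", "Brenda": "No",  "Colleagues": "No",  "Share": "Yes"},
    {"Bob": "No",  "Brenda": "No",  "Colleagues": "No",  "Share": "Yes"},
    {"Bob": "No",  "Brenda": "No",  "Colleagues": "No",  "Share": "Yes"},
    {"Bob": "No",  "Brenda": "No",  "Colleagues": "Yes", "Share": "Yes"},
]
for a in ("Bob", "Brenda", "Colleagues"):
    print(a, round(gain(data, a), 3))
```

With this data the Brenda attribute yields the largest gain, so it would be chosen as the root test, mirroring the discussion below.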
- the privacy engine 4 makes use of the privacy setting rules to make a recommendation to the user.
- the remainder can be calculated using the dataset shown in FIG. 2; for example, Gain(Brenda) = I(p/(p+n), n/(p+n)) - 0.394.
- the information gain for the Brenda attribute is 0.198, whereas the information gain for Bob and Colleagues both equate to only 0.128. Therefore, the Brenda node 11 shown in the privacy setting rules of FIG. 3 has the highest information content.
- the privacy rules update unit 6 therefore chooses the Brenda node 11 as the first decision point of the decision tree 16.
- the left branch 17 from the Brenda node 11 always results in a Yes decision, therefore there is no information provided by the attributes Bob or Colleagues.
- the privacy rules update unit 6 determines that the information gain for Colleagues is 0.918, whilst for Bob it is only 0.251, therefore the Colleagues node 13 is created. Thereafter, the attribute Bob does not provide any further information gain. Therefore no further branches are necessary.
- the privacy engine 4 determines the recommended privacy setting 8 based upon the privacy setting rules. Assuming that the intrinsic information 10 implies that the photo contained Bob and Brenda, but no Colleagues then the privacy engine 4 would traverse the decision tree 16 of FIG.3. Firstly, at the Brenda node 11 since Brenda is present the right branch 12 would be taken. At the Colleagues node 13 again the right branch 14 would be taken. The recommended privacy setting 8 is therefore provided by the leaf 15, which in this example would be a recommendation not to share the photo, which in this case is probably the correct recommendation given the circumstances.
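The traversal just described can be sketched as follows. The tree mirrors FIG. 3, though the leaf for the Brenda-and-Colleagues branch (share = Yes) is an assumption, since the description above only walks the other branches.

```python
# Decision tree 16 of FIG. 3 as nested dicts: inner nodes test an
# attribute, leaves hold the recommended Share setting.
tree = {"attr": "Brenda",
        "No": "Yes",                      # no Brenda -> share
        "Yes": {"attr": "Colleagues",
                "Yes": "Yes",             # Brenda with colleagues -> share (assumed)
                "No": "No"}}              # Brenda, no colleagues -> do not share

def recommend(node, intrinsic_info):
    """Traverse the decision tree using the extracted attributes and
    return the recommended privacy setting at the reached leaf."""
    while isinstance(node, dict):
        node = node[intrinsic_info[node["attr"]]]
    return node

# The photo containing Bob and Brenda but no colleagues:
print(recommend(tree, {"Bob": "Yes", "Brenda": "Yes", "Colleagues": "No"}))  # -> No
```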
- FIG. 4 shows a flowchart comprising the method steps of the invention.
- the flowchart of FIG. 4 is helpful when, as is common in the consumer electronics industry at this time, the invention is implemented in software making use of a processor and memory, as is well known to the skilled person.
- at step 40, audio-video content is received by any means well known to the skilled person; thereafter, at step 41, the audio-video content is analysed to extract the intrinsic information 10.
- the recommended privacy setting is determined as already described, based upon the privacy setting rules.
- the privacy setting rules are updated in preparation for the next recommendation. The next recommendation is made by returning to step 40.
- a privacy setting history 50 provides a location to store a historical collection of recommended privacy settings 8.
- an intrinsic information history 51 provides a location to store a historical collection of intrinsic information 10, or attributes, extracted from the audio-video content.
- the privacy rules update unit 6 is then in a position to re-evaluate the complete set of privacy setting rules comprised in the privacy rules base 5 after each new recommendation taking into account all information previously encountered.
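Re-inducing the rules from the complete histories can be sketched with a small ID3-style learner. The seven history entries below are hypothetical stand-ins for the stored histories, chosen so that the induced tree reproduces the shape of FIG. 3; with these values, the second-level gains come out as 0.918 for Colleagues and 0.251 for Bob, matching the figures quoted earlier.

```python
import math

def information(p, n):
    """I(p/(p+n), n/(p+n)) in bits for p positive, n negative outcomes."""
    if p == 0 or n == 0:
        return 0.0
    t = p + n
    return -(p / t) * math.log2(p / t) - (n / t) * math.log2(n / t)

def gain(examples, attribute):
    """Gain(A) = I(p/(p+n), n/(p+n)) - Remainder(A) over `examples`."""
    p = sum(e["Share"] == "Yes" for e in examples)
    remainder = 0.0
    for value in ("Yes", "No"):
        subset = [e for e in examples if e[attribute] == value]
        if subset:
            sp = sum(e["Share"] == "Yes" for e in subset)
            remainder += len(subset) / len(examples) * information(sp, len(subset) - sp)
    return information(p, len(examples) - p) - remainder

def induce(examples, attributes):
    """ID3-style induction of a decision tree from history entries."""
    labels = [e["Share"] for e in examples]
    if len(set(labels)) == 1:
        return labels[0]                               # pure leaf
    if not attributes:
        return max(set(labels), key=labels.count)      # majority leaf
    best = max(attributes, key=lambda a: gain(examples, a))
    node = {"attr": best}
    remaining = [a for a in attributes if a != best]
    for value in ("Yes", "No"):
        subset = [e for e in examples if e[best] == value]
        node[value] = (induce(subset, remaining) if subset
                       else max(set(labels), key=labels.count))
    return node

# Hypothetical joint content of the privacy setting history 50 and the
# intrinsic information history 51 (one entry per past recommendation):
history = [
    {"Bob": "Yes", "Brenda": "Yes", "Colleagues": "No",  "Share": "No"},
    {"Bob": "No",  "Brenda": "Yes", "Colleagues": "No",  "Share": "No"},
    {"Bob": "Yes", "Brenda": "Yes", "Colleagues": "Yes", "Share": "Yes"},
    {"Bob": "Yes", "Brenda": "No",  "Colleagues": "No",  "Share": "Yes"},
    {"Bob": "No",  "Brenda": "No",  "Colleagues": "No",  "Share": "Yes"},
    {"Bob": "No",  "Brenda": "No",  "Colleagues": "No",  "Share": "Yes"},
    {"Bob": "No",  "Brenda": "No",  "Colleagues": "Yes", "Share": "Yes"},
]
tree = induce(history, ["Bob", "Brenda", "Colleagues"])
print(tree["attr"])  # -> Brenda (highest information gain at the root)
```

Because the tree is rebuilt from scratch on every update, the rule base never accumulates inconsistent or contradictory rules, which is the manageability advantage described above.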
- a subset of the previous historical information may also be used to reduce the storage requirements of the historical information.
- FIG. 6 details the method steps required in processing audio-video content as performed by an embodiment such as that of FIG. 5.
- audio-video content is received in an identical manner to that described for FIG. 4.
- the audio-video content is analysed to extract the intrinsic information 10.
- the recommended privacy setting 8 is, as usual, determined based upon the privacy setting rules.
- the recommended privacy setting 8 is stored in a privacy setting history 50.
- the intrinsic information 10 is stored in an intrinsic information history 51.
- the privacy setting rules are updated in preparation for the next recommendation.
- the updating in step 43 may comprise an analysis of the complete historical information available or be limited to a subset thereof.
- the next recommendation may be made by returning to step 40.
- a third embodiment, disclosed in FIG. 7, will now be elucidated.
- in FIG. 7, the recommended privacy setting 8 is presented to a user 72 by means of a display means 70.
- the display means 70 can, for example, be a monitor, a television screen or a simple display device.
- the user 72 is invited to give user feedback 73 via an input means 71 on the recommended privacy setting 8 as determined by the privacy engine 4.
- Such feedback on the ground truth is useful in the learning process and can dramatically improve the quality of the recommended privacy setting 8. It is, of course, not necessary that the user 72 be consulted on all recommendations.
- the privacy setting history 50 and the intrinsic information history 51 may be statistically analysed to produce a confidence level in the recommended privacy setting 8 and only in cases of low confidence would the user be troubled with supplying user feedback 73.
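One way to realize such a confidence test is sketched below; the majority-vote measure and the 0.8 threshold are illustrative assumptions, not the statistical analysis actually claimed.

```python
# Illustrative low-confidence check: consult the user only when the
# histories disagree (or are silent) about this attribute combination.
def needs_feedback(history, attrs, threshold=0.8):
    """Return True when the histories give low confidence for `attrs`.

    history: (intrinsic_attrs, recommended_setting) pairs drawn from
    the privacy setting history 50 and intrinsic information history 51.
    Confidence is the fraction of past identical attribute sets that
    received the majority recommendation.
    """
    settings = [s for a, s in history if a == attrs]
    if not settings:
        return True                      # never seen before: ask the user
    majority = max(settings.count(s) for s in set(settings))
    return majority / len(settings) < threshold

past = [(("Brenda",), "No"), (("Brenda",), "No"), (("Brenda",), "Yes")]
print(needs_feedback(past, ("Brenda",)))   # -> True: only 2/3 agreement
```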
- FIG. 7 also has a well-defined process flow that is described with reference to the flowchart of FIG. 8.
- audio-video content is received and in step 41 the audio-video content is analysed to extract the intrinsic information 10. Both of these steps are performed in the usual manner as described earlier.
- the recommended privacy setting 8 is determined based upon the privacy setting rules, also in the usual manner.
- the recommended privacy setting 8 is presented to a user 72 by means of a display means 70.
- user feedback 73 is received.
- the privacy setting rules are again updated in preparation for the next recommendation.
- the updating in step 43 may further comprise an analysis of the user feedback 73 allowing the recommended privacy setting 8 to be improved, for example, by modification.
- the next recommendation may be made by returning to step 40 in the usual manner.
- FIG. 9 shows a fourth embodiment in which the storage unit 7 is a secure storage unit 92.
- the secure storage unit 92 may accept as input for storage the audio-video content, the recommended privacy setting 8, the intrinsic information 10 and may further comprise a secure channel 91 communicatively coupled to a secure user identification unit 90.
- the secure user identification unit 90 may be a smartcard, an input means for a personal identification number or a username and password combination, or even a biometric device registering fingerprints. Any known secure user identification means may be used, as known to the skilled person.
- the secure storage unit 92 may further comprise an encryptor and controller for managing access rights and communicating securely via the secure channel 91 to the secure user identification unit 90.
- the invention may also be embodied as a computer program product, storable on a storage medium and enabling a computer to be programmed to execute the method according to the invention.
- the computer can be embodied as a general-purpose computer like a personal computer or network computer, but also as a dedicated consumer electronics device with a programmable processing core.
Abstract
An apparatus and method are disclosed for automatically determining privacy settings for content, such as audio-video content, photographs and other documents. The apparatus (1) comprises an audio-video content source (2) and an intrinsic content analyser (3) communicatively coupled to the audio-visual source (2), such as a picture or film source (9). The content is analysed for intrinsic information (10). The system further comprises a privacy engine (4) communicatively coupled to the intrinsic content analyser. A recommended privacy setting (8) is determined on the basis of privacy setting rules from a privacy rules base (5) and the intrinsic information (10). The privacy setting rules are updated by a privacy rules update unit (6), taking into account the recommended privacy setting (8) and the intrinsic information (10). The audio-video content is then stored in a storage unit (7) in accordance with the recommended privacy setting (8).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP05107198 | 2005-08-04 | ||
EP05107198.3 | 2005-08-04 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2007015184A2 true WO2007015184A2 (fr) | 2007-02-08 |
WO2007015184A3 WO2007015184A3 (fr) | 2007-05-31 |
Family
ID=37570456
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2006/052482 WO2007015184A2 (fr) | 2005-08-04 | 2006-07-20 | Apparatus and method for automatically determining privacy settings for content
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2007015184A2 (fr) |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010058061A1 (fr) * | 2008-11-18 | 2010-05-27 | Nokia Corporation | Method, apparatus and computer program product for determining media item privacy settings |
US8234688B2 (en) * | 2009-04-03 | 2012-07-31 | International Business Machines Corporation | Managing privacy settings for a social network |
US20130174213A1 (en) * | 2011-08-23 | 2013-07-04 | James Liu | Implicit sharing and privacy control through physical behaviors using sensor-rich devices |
US9153195B2 (en) | 2011-08-17 | 2015-10-06 | Microsoft Technology Licensing, Llc | Providing contextual personal information by a mixed reality device |
US9443098B2 (en) | 2012-12-19 | 2016-09-13 | Pandexio, Inc. | Multi-layered metadata management system |
US9491258B2 (en) | 2014-11-12 | 2016-11-08 | Sorenson Communications, Inc. | Systems, communication endpoints, and related methods for distributing images corresponding to communication endpoints |
US9536350B2 (en) | 2011-08-24 | 2017-01-03 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US9767524B2 (en) | 2011-08-09 | 2017-09-19 | Microsoft Technology Licensing, Llc | Interaction with virtual objects causing change of legal status |
US9773000B2 (en) | 2013-10-29 | 2017-09-26 | Pandexio, Inc. | Knowledge object and collaboration management system |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
US10747898B2 (en) | 2016-10-20 | 2020-08-18 | International Business Machines Corporation | Determining privacy for a user and a product in a particular context |
US10789656B2 (en) | 2009-07-31 | 2020-09-29 | International Business Machines Corporation | Providing and managing privacy scores |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030161499A1 (en) * | 2002-02-28 | 2003-08-28 | Hugh Svendsen | Automated discovery, assignment, and submission of image metadata to a network-based photosharing service |
US20030226038A1 (en) * | 2001-12-31 | 2003-12-04 | Gil Raanan | Method and system for dynamic refinement of security policies |
US20040268251A1 (en) * | 2003-06-30 | 2004-12-30 | Vladimir Sadovsky | System and method for rules-based image acquisition |
- 2006-07-20: PCT application PCT/IB2006/052482 filed (WO), published as WO2007015184A2 (fr); status: active, Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030226038A1 (en) * | 2001-12-31 | 2003-12-04 | Gil Raanan | Method and system for dynamic refinement of security policies |
US20030161499A1 (en) * | 2002-02-28 | 2003-08-28 | Hugh Svendsen | Automated discovery, assignment, and submission of image metadata to a network-based photosharing service |
US20040268251A1 (en) * | 2003-06-30 | 2004-12-30 | Vladimir Sadovsky | System and method for rules-based image acquisition |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8301659B2 (en) | 2008-11-18 | 2012-10-30 | Core Wireless Licensing S.A.R.L. | Method, apparatus, and computer program product for determining media item privacy settings |
US9058501B2 (en) | 2008-11-18 | 2015-06-16 | Core Wireless Licensing S.A.R.L. | Method, apparatus, and computer program product for determining media item privacy settings |
WO2010058061A1 (fr) * | 2008-11-18 | 2010-05-27 | Nokia Corporation | Method, apparatus and computer program product for determining media item privacy settings |
US8234688B2 (en) * | 2009-04-03 | 2012-07-31 | International Business Machines Corporation | Managing privacy settings for a social network |
US10789656B2 (en) | 2009-07-31 | 2020-09-29 | International Business Machines Corporation | Providing and managing privacy scores |
US9767524B2 (en) | 2011-08-09 | 2017-09-19 | Microsoft Technology Licensing, Llc | Interaction with virtual objects causing change of legal status |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
US9153195B2 (en) | 2011-08-17 | 2015-10-06 | Microsoft Technology Licensing, Llc | Providing contextual personal information by a mixed reality device |
US10223832B2 (en) | 2011-08-17 | 2019-03-05 | Microsoft Technology Licensing, Llc | Providing location occupancy analysis via a mixed reality device |
US20130174213A1 (en) * | 2011-08-23 | 2013-07-04 | James Liu | Implicit sharing and privacy control through physical behaviors using sensor-rich devices |
US9536350B2 (en) | 2011-08-24 | 2017-01-03 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US11127210B2 (en) | 2011-08-24 | 2021-09-21 | Microsoft Technology Licensing, Llc | Touch and social cues as inputs into a computer |
US9881174B2 (en) | 2012-12-19 | 2018-01-30 | Pandexio, Inc. | Multi-layered metadata management system |
US9443098B2 (en) | 2012-12-19 | 2016-09-13 | Pandexio, Inc. | Multi-layered metadata management system |
US9773000B2 (en) | 2013-10-29 | 2017-09-26 | Pandexio, Inc. | Knowledge object and collaboration management system |
US10592560B2 (en) | 2013-10-29 | 2020-03-17 | Pandexio, Inc. | Knowledge object and collaboration management system |
US9959014B2 (en) | 2014-11-12 | 2018-05-01 | Sorenson Ip Holdings, Llc | Systems, communication endpoints, and related methods for distributing images corresponding to communication endpoints |
US9491258B2 (en) | 2014-11-12 | 2016-11-08 | Sorenson Communications, Inc. | Systems, communication endpoints, and related methods for distributing images corresponding to communication endpoints |
US10747898B2 (en) | 2016-10-20 | 2020-08-18 | International Business Machines Corporation | Determining privacy for a user and a product in a particular context |
Also Published As
Publication number | Publication date |
---|---|
WO2007015184A3 (fr) | 2007-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2007015184A2 (fr) | Apparatus and method for automatically determining privacy settings for content | |
US8358811B2 (en) | Method and apparatus to incorporate automatic face recognition in digital image collections | |
JP5801395B2 (ja) | Automatic media sharing via shutter click | |
US8473525B2 (en) | Metadata generation for image files | |
US10043059B2 (en) | Assisted photo-tagging with facial recognition models | |
CN110263642B (zh) | Image cache for replacing portions of images | |
EP2973013B1 (fr) | Association de métadonnées à des images d'une collection d'images personnelles | |
US9495583B2 (en) | Organizing images by correlating faces | |
US8032539B2 (en) | Method and apparatus for semantic assisted rating of multimedia content | |
CN112860943A (zh) | Teaching video review method, apparatus, device and medium | |
US20160275526A1 (en) | Time Based Anomaly Analysis in Digital Documents | |
CN104813674A (zh) | System and method for optimizing video | |
CN109583228B (zh) | Privacy information management method, apparatus and system | |
US11783072B1 (en) | Filter for sensitive data | |
US20130066872A1 (en) | Method and Apparatus for Organizing Images | |
US20230205812A1 (en) | Ai-powered raw file management | |
US20230113131A1 (en) | Self-Supervised Learning of Photo Quality Using Implicitly Preferred Photos in Temporal Clusters | |
EP3877870B1 (fr) | Computer systems and methods for cataloguing, retrieving and organizing user-generated content associated with objects | |
CN118093779A (zh) | Change order query method and apparatus | |
CN118551110A (zh) | Vehicle wallpaper recommendation method, apparatus, electronic device and storage medium | |
CN117590981A (zh) | Display method, apparatus, electronic device and readable storage medium | |
Pavithra et al. | Sharing of Images in Content Sharing Sites Based on User Profile Inferences |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 06780142; Country of ref document: EP; Kind code of ref document: A2 |