US20090300525A1 - Method and system for automatically updating avatar to indicate user's status


Info

Publication number
US20090300525A1
Authority
US
United States
Prior art keywords
avatar
user
mobile device
selection criteria
server
Prior art date
Legal status
Abandoned
Application number
US12/127,349
Inventor
Maria Elena Romera JOLLIFF
Samuel Jacob HORODEZKY
Tia CHUNG
Kameron Kerger
Gregory James BROWN
Todd Jeffrey JOHNSGARD
Joseph Jyh-Huei HUANG
Ankur JALOTA
Devender YAMAKAWA
Jadine Naomi YEE
Scott Alan LEAZENBY
Chad Andrew WILLKIE
Current Assignee
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date
Filing date
Publication date
Application filed by Qualcomm Inc
Priority to US12/127,349 (published as US20090300525A1)
Assigned to QUALCOMM INCORPORATED (assignment of assignors' interest; see document for details). Assignors: BROWN, GREGORY JAMES; KERGER, KAMERON; WILLKIE, CHAD ANDREW; YEE, JADINE NAOMI; HORODEZKY, SAMUEL JACOB; JOHNSGARD, TODD JEFFREY; ROMERA JOLLIFF, MARIA ELENA; CHUNG, TIA; HUANG, JOSEPH JYH-HUEI; JALOTA, ANKUR; YAMAKAWA, DEVENDER; LEAZENBY, SCOTT ALAN
Priority to KR1020107029200A (published as KR20110014224A)
Priority to EP09755612A (published as EP2294802A1)
Priority to JP2011511693A (published as JP5497015B2)
Priority to CN201610090062.5A (published as CN105554311A)
Priority to CN2009801187120A (published as CN102037716A)
Priority to PCT/US2009/043542 (published as WO2009146250A1)
Publication of US20090300525A1
Priority to JP2013229820A (published as JP2014059894A)
Status: Abandoned

Classifications

    • G01D 21/02: Measuring or testing not otherwise provided for; measuring two or more variables by means not covered by a single other subclass
    • H04M 1/72451: User interfaces specially adapted for cordless or mobile telephones, with means for adapting the functionality of the device according to specific conditions, according to schedules, e.g. using calendar applications
    • H04L 67/306: Network arrangements or protocols for supporting network services or applications; user profiles
    • A63F 13/352: Details of game servers involving special game server arrangements, e.g. regional servers connected to a national server or a plurality of servers managing partitions of the game world
    • H04L 51/04: User-to-user messaging in packet-switching networks; real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L 51/52: User-to-user messaging in packet-switching networks, for supporting social networking services
    • H04L 67/02: Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L 67/54: Network services; presence management, e.g. monitoring or registration for receipt of user log-on information, or the connection status of the users
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality
    • H04M 1/72427: User interfaces with means for local support of applications that increase the functionality, for supporting games or graphical animations
    • A63F 2300/5553: Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history; user representation in the game field, e.g. avatar
    • H04L 63/083: Network architectures or network communication protocols for network security, for authentication of entities using passwords
    • H04M 1/72457: User interfaces with means for adapting the functionality of the device according to specific conditions, according to geographic location
    • H04M 2250/12: Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Definitions

  • The present invention relates generally to providing a current indication of a user's status or activity via a computer-generated avatar.
  • An avatar is a virtual representation of a computer user.
  • The term avatar can also refer to the personality connected with a screen name, or handle, of an Internet user.
  • Avatars are often used to represent the real world user in the virtual world of computing.
  • Avatars can be three-dimensional models used in virtual reality applications and computer games.
  • Avatars can also be a two-dimensional icon (picture) used in Internet forums and other online communities, instant messaging, gaming and non-gaming applications.
  • Avatars may be animated or static.
  • The term avatar dates at least as far back as 1985, when it was used as the name for the player character in a series of computer games. Recently, the usage of avatars has spread in popularity, and avatars are now often used in Internet forums. Avatars on Internet forums serve the purpose of representing users and their actions, personalizing their contributions to the forum, and may represent different parts of their persona, beliefs, interests or social status in the forum.
  • The traditional avatar system used on most Internet forums is a small (96×96 to 100×100 pixels, for example) square-shaped area close to the user's forum post, where the avatar is placed.
  • Some forums allow the user to upload an avatar image that may have been designed by the user or acquired from elsewhere.
  • Other forums allow the user to select an avatar from a preset list or use an auto-discovery algorithm to extract one from the user's homepage.
  • In the instant messaging (IM) context, avatars, sometimes referred to as buddy icons, are usually small images. For example, IM icons are 48×48 pixels, although many icons found online measure anywhere from 50×50 to 100×100 pixels. A wide variety of these image avatars can be found on web sites and popular eGroups such as Yahoo! Groups. The latest use of avatars in instant messaging is dominated by dynamic avatars: the user chooses an avatar to represent him while chatting and, through the use of text-to-speech technology, enables the avatar to speak the text typed in the chat window. Another use for this kind of avatar is in video chats/calls; some services, such as Skype (through some external plug-ins), allow users to use talking avatars during video calls, replacing the image from the user's camera with an animated, talking avatar.
  • Embodiments may receive information from a variety of sensors located either within the user's mobile device or within close proximity to the mobile device to provide some parameters of the user's real world environment.
  • The variety of sensors may include, but is not limited to, a location sensor (e.g., GPS coordinates), a microphone for sensing ambient noise, a camera or light sensor for sensing ambient light, accelerometers, a temperature sensor, and bio-physiological sensors such as a breathalyzer, heart rate monitor, pulse sensor, EEG, ECG (EKG), and/or blood pressure sensor.
  • embodiments may utilize a user's calendar data as well as mobile device settings to generate an updated virtual representation via an avatar of the user's real world status or activity.
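  • As a concrete illustration of this data-gathering step, the following minimal Python sketch collects sensor readings, calendar data, and device settings into a single snapshot that avatar-selection logic could later evaluate; the `sensors.read`, `calendar.current_entry`, and `settings.get` accessors are assumptions of this sketch, not interfaces defined by the patent.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

@dataclass
class StatusSnapshot:
    """One point-in-time collection of the parameters used to infer status."""
    timestamp: datetime
    gps: Optional[Tuple[float, float]]   # (latitude, longitude), or None if unavailable
    ambient_noise_db: Optional[float]
    ambient_light: Optional[str]         # e.g., "bright", "dim"
    acceleration_g: Optional[float]
    temperature_f: Optional[float]
    calendar_entry: Optional[str]        # current appointment, if any
    ringer_setting: str                  # e.g., "loud", "vibrate", "silent"

def gather_snapshot(sensors, calendar, settings) -> StatusSnapshot:
    """Poll each data source once; any source may report None."""
    return StatusSnapshot(
        timestamp=datetime.now(),
        gps=sensors.read("gps"),
        ambient_noise_db=sensors.read("microphone"),
        ambient_light=sensors.read("light"),
        acceleration_g=sensors.read("accelerometer"),
        temperature_f=sensors.read("temperature"),
        calendar_entry=calendar.current_entry(),
        ringer_setting=settings.get("ringer"),
    )
```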
  • Alternative embodiments may age the user's avatar over time so that the avatar grows older and more mature as the user does.
  • Various embodiments automatically update or change the user's avatar as the user goes about his/her daily activities.
  • Other embodiments update or change the user's avatar when a request to view the avatar is made.
  • the user's avatar may be viewed in a singular location, such as a webpage.
  • Alternative embodiments may allow a user's avatar to be downloaded to any requesting party.
  • Still other embodiments may pro-actively inform selected parties of a user's current real world status or activity by sending an avatar.
  • FIG. 1 illustrates exemplary avatars suitable for use with the various embodiments.
  • FIG. 2 is a system block diagram of a system suitable for use with the various embodiments.
  • FIG. 3 is a system block diagram of a mobile device suitable for use with the various embodiments.
  • FIG. 4 is a process flow diagram of an embodiment method suitable for implementation on the system.
  • FIG. 5 is a process flow diagram of a specific embodiment method suitable for implementation on a mobile handset.
  • FIG. 6 a is an example parameter data table suitable for storing a variety of sensor data, user calendar data, and mobile device settings indicating the current status of the user.
  • FIG. 6 b is an illustrative avatar selection logic table which indicates an avatar to display based on various parameters.
  • FIG. 6 c is a process flow diagram of an embodiment method for calibrating an avatar selection logic table.
  • FIG. 7 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which conserves battery and processor time.
  • FIG. 8 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which responds to a server request.
  • FIG. 9 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which responds to a second user request.
  • FIG. 10 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server.
  • FIG. 11 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server which conserves battery and processor time.
  • FIG. 12 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server which conserves battery and processor time by responding to a server request.
  • FIG. 13 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server which conserves battery and processor time by responding to a second user request.
  • FIG. 14 a is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device.
  • FIG. 14 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar directly on the requesting device.
  • FIG. 15 a is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device which conserves battery and processor time.
  • FIG. 15 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar directly on the requesting device which conserves battery and processor time.
  • FIG. 16 a is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device which conserves battery and processing time by responding to a second user request.
  • FIG. 16 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar directly on the requesting device which conserves battery and processing time by responding to a second user request.
  • FIG. 17 is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device wherein avatar selection is offloaded to the requesting user's device.
  • FIG. 18 is a process flow diagram of an alternative embodiment method suitable for implementation on the system.
  • FIG. 19 a is an example parameter data table suitable for storing a variety of sensor data, user calendar data, mobile device settings and authorization level of a user requesting an avatar.
  • FIG. 19 b is an illustrative avatar selection logic table which indicates an avatar to display based on various parameters including the authorization level of the requesting user.
  • FIG. 19 c is a process flow diagram of an embodiment method for calibrating an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 20 is a process flow diagram of an embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 21 is a process flow diagram of another embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 22 is a process flow diagram of another embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 23 is a process flow diagram of another embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 24 a is a process flow diagram of another embodiment method suitable for displaying an avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 24 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 25 a is a process flow diagram of another embodiment method suitable for displaying an avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 25 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 26 is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device based upon sensor and settings data and the second user's authorization level.
  • The term mobile device may refer to any one or all of cellular telephones, personal data assistants (PDAs), palm-top computers, laptop computers, wireless electronic mail receivers (e.g., the Blackberry® and Treo® devices), multimedia Internet enabled cellular telephones (e.g., the iPhone®), and similar personal electronic devices which include a programmable processor and memory.
  • the mobile device is a cellular handset that can communicate via a cellular telephone network (e.g., a cellphone).
  • cellular telephone communication capability is not necessary in all embodiments.
  • wireless data communication may be achieved by the mobile device connecting to a wireless data network (e.g., a WiFi network) instead of a cellular telephone network.
  • server refers to any of a variety of commercially available computer systems configured to operate in a client-server architecture.
  • server refers to network servers, particularly Internet accessible servers, which typically include a processor, memory (e.g., hard disk memory), and network interface circuitry configured to connect the server processor to the network, such as the Internet.
  • the term “theme” refers to the collection of user-configurable settings that may be implemented on a mobile handset to personalize the mobile handset to the user's preferences.
  • a theme is defined by the files and settings used for any and all of the wallpaper (i.e., images presented on the mobile handset display), ring tones (i.e., audio files triggered by different events), ring settings (e.g., loud, medium, soft and silent, as well as vibrate mode), button tones (i.e., tones played when buttons are pressed), button functions, display icons, and speed dial settings (i.e., the telephone number associated with each button configured for speed dialing).
  • a theme may also include settings for other user-configurable settings like password protections, keypad locks, carrier selections, etc. Composed of such data, a theme can be stored as a mix of files (e.g., image and audio files), as well as configuration data (e.g., telephone numbers associated with particular speed dial buttons).
  • Avatars have gained increasing popularity in use as graphical representations of an individual.
  • An avatar can be text (such as a screen name) or a two or three-dimensional graphical representation (e.g., a photograph, cartoon or machine-generated image).
  • Avatars can be static images or dynamic (animated) images. Examples of some avatars are illustrated in FIG. 1 .
  • avatars can be images which graphically communicate information about the associated individual, such as professions, hobbies, current activities and moods.
  • users have used an avatar as a representation of the user or the user's persona in online gaming or internet forum chats.
  • Avatars can efficiently convey information about the user, such as the user's interests, just by nature of the avatar.
  • a user would select an avatar to display a representation of the user participating in an online game, internet forum chat, SMS chat, etc.
  • the selected avatar file may be used to represent the real world user in the virtual world of computing or electronic telecommunications.
  • the various embodiments disclosed herein enable users to generate or post an avatar that more closely represents in a virtual world the real world status of the user.
  • Mobile devices, particularly cellular telephones, are practically ubiquitous and indispensable. Consequently, mobile devices can be ideal platforms for housing sensors that can measure the environment and activities of the user.
  • computer systems can infer user activities, information that can be used in the various embodiments to update users' avatars to reflect their real world activities. For example, a user's mobile device may “learn” that the user's current GPS location is a conference room in the office. Accordingly, the mobile device may automatically set the mobile device to vibrate mode and also automatically update the user's avatar to depict “do not disturb.”
  • the various embodiments incorporate or make use of a variety of sensors housed in a user's mobile device, and use the sensor information to update or generate avatars which can be displayed to reflect the user's status, location, mood and/or activity.
  • the various embodiments may employ a variety of sensors and access schedule or calendar information maintained within the mobile device to more accurately reflect the user's status.
  • Such an avatar may be made available for public or private viewing, in order to quickly inform viewers of the user's current status, location, mood and/or activity.
  • Such avatars may be sent to others proactively, such as appended to or included within an SMS or e-mail message, or posted to a server where others can access or download the avatar, such as by accessing an on-line game or website where the avatar is maintained.
  • Avatars may be changed according to users' status on a pre-scheduled basis (e.g., periodic updating), whenever the user's status is requested (e.g., in response to a request for an avatar), or whenever the user's status changes (e.g., when sensors indicate the user's location, mood and/or activity have changed).
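  • A hedged sketch of how these three update triggers might be dispatched; `get_status` and `publish_avatar` stand in for the status-inference and delivery mechanisms described elsewhere in this document and are assumptions of this sketch.

```python
import time

def run_update_loop(policy, get_status, publish_avatar, period_s=300):
    """Keep a published avatar current under one of three trigger policies:
    'periodic'   - republish every period_s seconds,
    'on_change'  - republish only when the inferred status changes,
    'on_request' - nothing to do here; updates happen when a viewer asks.
    """
    last_status = None
    while policy in ("periodic", "on_change"):
        status = get_status()              # infer status from sensors/calendar/settings
        if policy == "periodic" or status != last_status:
            publish_avatar(status)         # e.g., upload the avatar file or its ID to a server
            last_status = status
        time.sleep(period_s)
```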
  • FIG. 2 is a system block diagram of a computing and telecommunications system suitable for use with various embodiments disclosed herein.
  • the system block diagram illustrates an exemplary cellular network system, but is not intended to contemplate all possible configurations.
  • A variety of users, engaged in a variety of activities, may have on their person a variety of mobile devices.
  • a first user may be using a laptop computer 101 in a coffee shop.
  • a second user may be shopping at an airport carrying a PDA 102 .
  • a third user may be driving a car with a cell phone 103 .
  • a fourth user may be conducting a meeting and carrying a cell phone 104 .
  • Each of the mobile devices 101 , 102 , 103 , and 104 may communicate via wireless networks with wireless communication base stations 105 .
  • the wireless communication base stations 105 may be a cell phone tower, Bluetooth receiver, WiFi receiver, or any other wireless transceiver.
  • the devices communicate via cellular telephone and data networks with a cellular base station antenna 105 .
  • the wireless communication base stations 105 may be coupled to a router 106 and wireless communication server 107 to connect to the Internet 108 .
  • Alternative paths to the Internet 108 may involve more or less communication equipment. For example, some wireless service providers may require additional servers to provide their users with access to the Internet 108 .
  • the wireless communication base station may permit a more direct link to the Internet 108 .
  • Users of mobile devices may communicate with one another or with any other user connected through the Internet 108 .
  • a first user may send an email from his laptop 101 to a user at desktop computer 113 , a user of a PDA 114 , a user of a laptop computer 115 , or other users via their cell phones 116 , 117 .
  • the user would send an email from the laptop 101 which would be wirelessly transmitted to the base station 105 .
  • the email would be sent via a router 106 to a server 107 , across the Internet 108 to a server 110 servicing the intended recipient's computing device, then to a router 111 , where it might be sent via a wired connection to a desktop computer 113 or via a wireless base station 112 to a mobile device 114 - 117 .
  • the recipient can reply or initiate communications to the user in the reverse manner.
  • Mobile device users may be unavailable to respond to incoming messages from time to time and may wish to provide some indication of their current status to explain why they are non-responsive.
  • mobile device users may want to inform others as to their current status so that others can know if they are available to communicate.
  • some users may wish to inform their friends and family of their current status and activities as part of their social networking lifestyle. Such notification of a user's status may be accomplished efficiently using an avatar that can be accessed by or presented to selected individuals.
  • Such an avatar may be maintained and displayed, for example, on the user's social networking webpage (e.g., myspace.com, facebook.com, etc.) or any other webpage maintained on an Internet accessible server.
  • the avatar, along with the contents of the webpage and data contained therein, may be stored in the memory of a server 109 .
  • the server 109 is connected to the Internet 108 and may be accessed by devices with Internet 108 capabilities and proper access rights.
  • users may access a person's avatar, such as by accessing a webpage to display the current status avatar, prior to communicating (e.g., calling, sending an e-mail or sending an SMS message).
  • a user may be automatically directed to the webpage containing the current status avatar if the person does not respond or if the person has selected a “do not disturb” option.
  • an avatar file may be automatically and directly sent back to a user's device 113 - 117 for display.
  • avatar files may be proactively sent to a pre-approved list of recipients whenever a user's status changes, or on regularly scheduled or predetermined intervals.
  • a link to a user's webpage including an avatar may be sent to a pre-approved list of recipients whenever a user's status changes, or on regularly scheduled or predetermined intervals.
  • server 109 may not be necessary if avatar files are being sent directly to a user's device 113 - 117 .
  • avatars may be hosted on a variety of servers 107 , 110 , and need not be limited to a particular server 109 .
  • an embodiment enables users to override the automatic updating in order to select a particular avatar regardless of the user's current status.
  • users may elect to have their avatars reflect their current status at some times, while selecting particular avatars at other times.
  • the user may also set permission or authorization levels for avatars to control who may view a particular avatar, at what times, and during what activities. For example, a user may elect to enable the user's supervisor to view a particular avatar only during work hours. Outside work hours the avatar may be hidden, not transmitted, or reflect a general status, such as busy.
  • users can set the information and avatars that are public. Such embodiments may be used in updating an avatar on a website or in a general broadcast.
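  • One plausible way to encode such per-viewer authorization rules is a small policy table, as in this illustrative sketch; the levels, hours, and names here are assumptions, not values from the patent.

```python
from datetime import datetime, time

# Hypothetical per-viewer policy: which avatar each authorization level may see,
# and during which hours (mirroring the supervisor/work-hours example above).
VIEW_POLICY = {
    "supervisor": {"avatar": "work",    "hours": (time(9, 0), time(17, 0))},
    "family":     {"avatar": "current", "hours": None},   # no time restriction
    "public":     {"avatar": "busy",    "hours": None},   # general status only
}

def avatar_for_viewer(level, current_avatar, now=None):
    """Return the avatar a requester at `level` may see, or None to hide it."""
    now = now or datetime.now()
    policy = VIEW_POLICY.get(level, VIEW_POLICY["public"])
    hours = policy["hours"]
    if hours and not (hours[0] <= now.time() <= hours[1]):
        return None                       # outside permitted hours: avatar hidden
    return current_avatar if policy["avatar"] == "current" else policy["avatar"]
```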
  • Information from sensors within a user's mobile device can be used to determine how an avatar should be updated.
  • Each of the mobile devices 101 - 104 may include a variety of sensors which can provide information used to determine the respective users' status. Examples of various sensors will be described in more detail below.
  • the day, time, and mobile device settings may be considered to provide further information regarding the respective users' status. Further, the mobile device's own operating status may provide information on the user's status.
  • If a user regularly visits a coffee shop and uses the shop's wireless WiFi network, this status could be determined based upon: (1) the time of day and day of the week (TOD/DOW), particularly if the breaks are regular or scheduled; (2) activation of the WiFi transceiver, perhaps in combination with TOD/DOW information; (3) GPS (or other location sensor) coordinates, perhaps in combination with TOD/DOW information and activation of the WiFi transceiver; or (4) background noise picked up by the device's microphone (e.g., the unique sound of an espresso machine), perhaps in combination with TOD/DOW information, activation of the WiFi transceiver, and GPS coordinate information.
  • Other sensors may also confirm the user's status is consistent with a coffee break, including accelerometers (indicating little if any motion) and temperature (indicating an ambient temperature consistent with an indoor location). If the user skipped his/her coffee break, was home sick or on vacation, an avatar display based solely on TOD/DOW information would inaccurately depict the user's current status.
  • Using location information, the system can determine if the user is within (or close to) the coffee shop location. Using background noise sensing, the system may confirm a coffee break status by recognizing espresso machine noise.
  • a system (whether the user's mobile device 101 or a server 109 receiving information from the device 101 ) can select or generate an avatar that graphically depicts this status.
  • the avatar could be altered to reflect the user's ambient environment. This may suggest to someone viewing the avatar that a non-verbal means of communication (e.g., text message) may be the best form of communication if someone wanted to contact the user.
  • Background noise may be monitored (e.g., using the mobile device's microphone) for music and other sounds that may be used to infer the user's mood. For example, if the background noise includes music with an up-tempo beat, an avatar expressing a happy mood may be selected. As this example illustrates, by increasing the number of sensors used and the variety of information considered, a system can better infer the user's current status.
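  • The multi-signal corroboration in this coffee-shop example could be approximated with a simple additive scoring rule, sketched below; the attribute and method names (`wifi_connected`, `noise_signature`, `is_usual_break`, `contains`) are assumptions for illustration.

```python
def infer_coffee_break(snapshot, known_locations, schedule):
    """Combine independent signals, each adding confidence to the inference.
    known_locations maps labels ("coffee shop") to geographic regions;
    schedule records the user's habitual break times (both hypothetical)."""
    score = 0
    if schedule.is_usual_break(snapshot.timestamp):
        score += 1                               # (1) TOD/DOW matches a regular break
    if snapshot.wifi_connected:
        score += 1                               # (2) WiFi transceiver is active
    if known_locations.contains("coffee shop", snapshot.gps):
        score += 1                               # (3) GPS places the user at the shop
    if snapshot.noise_signature == "espresso machine":
        score += 1                               # (4) characteristic background noise
    return score >= 3                            # require corroboration, not TOD/DOW alone
```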
  • a user status of flying in an airplane may be determined by comparing TOD/DOW information to the user's Calendar information. If the user is scheduled to be on an airplane flight and there is no contrary information (e.g., an open WiFi connection with the user's mobile device 102 ), a system (whether the user's mobile device 101 or a server 109 receiving information from the device 101 ) can select or generate an avatar that graphically depicts this status. If airlines begin allowing mobile devices to communicate during flights, the mobile device 102 may also report sensor information consistent with airline travel to permit confirmation of the status. For example, constantly changing GPS data from the mobile device 102 , accelerometer data and/or barometric pressure data from a sensor on the mobile device 102 may all be used to confirm airline travel.
  • A user driving or riding in a car may be determined based upon TOD/DOW (e.g., the time and day being consistent with rush hour) in combination with GPS location information and accelerometer sensor readings. Constantly changing GPS data and accelerometer data from the mobile device 103 may be used to confirm the user's status as driving.
  • the status of a user of a cell phone 104 in a business meeting may be inferred from TOD/DOW information compared against the user's Calendar.
  • the TOD/DOW and calendar information may be maintained within the cell phone 104 and/or a server 109 .
  • Such a preliminary determination can be confirmed by considering other sensor information, such as GPS location information, and accelerometer and temperature sensor readings, and the cell phone 104 operating settings. For example, if the user has switched his mobile device 104 to vibrate or silent ring, these settings are consistent with the user being in a meeting and help to confirm the user's status.
  • If the accelerometer readings show little significant movement and the temperature is consistent with an indoor location, this information can be used to confirm that the user is in a meeting. With the user's status so inferred, an appropriate avatar file can be selected or generated.
  • a user of a cell phone 104 may choose to display a set avatar while on vacation (e.g., an avatar sleeping in a hammock) instead of displaying current status.
  • the cell phone 104 may be configured to not relay sensor data or communicate a set avatar, or a server 109 may be configured to ignore sensor data and TOD/DOW information and display a selected avatar.
  • GPS location information may override TOD/DOW plus calendar information, such as when the GPS location is different from the location indicated by the calendar information. Logic rules may be generated to deal with such contradictory information.
  • user inputs and settings regarding current status may override all sensor, phone settings, or calendar parameter data.
  • FIG. 3 illustrates a component block diagram of an embodiment of a mobile device suitable for use in the overview system.
  • a typical mobile device 301 may include a microprocessor 391 , a memory 392 , an antenna 394 , a display 393 , an alphanumeric keypad 396 , a 4-way menu selector rocker switch 397 , a speaker 388 , a microphone 389 , a vocoder 399 , a receiver 395 , a transmitter 398 , (together a cellular network transceiver) and various circuits, busses and electrical interconnections among these components.
  • the mobile device 301 may include an ambient noise sensor 350 which is connected to the microphone 389 to detect ambient noise levels.
  • the ambient noise sensor 350 may also function as a microphone for speaker phone operations.
  • the mobile device 301 may also include a camera 351 which in addition to taking pictures can be used to measure ambient light levels.
  • the mobile device may also contain an ambient temperature sensor 352 and one or more accelerometers 353 detecting the relative acceleration of the mobile device 301 .
  • the mobile device may also include a GPS receiver circuit 354 which is configured to receive signals from GPS satellites to determine the precise global position of the mobile device 301 .
  • the mobile device may also include a breathalyzer sensor 355 which is configured to measure a blood alcohol content (BAC) from a user's exhaled breath.
  • Other biometric sensors may also be included, such as, for example, a blood pressure monitor, pulse rate sensor, EEG, ECG, EKG, etc.
  • a user's avatar may, for example, be displayed to be concentrating if the EEG sensor indicates brainwave patterns consistent with concentration levels.
  • the user's avatar may be displayed to indicate the user in a relaxed mental state if the EEG sensor indicates brainwave patterns consistent with relaxation levels.
  • Each of the mobile device 301 sensors 350 - 356 is connected to the processor 391 , which is in turn connected to an internal memory unit 392 .
  • the processor 391 may collect parameter data from the various sensors 350 - 356 and may store the data in memory unit 392 or transmit the data via transmitter 398 .
  • Although the mobile device 301 is depicted in FIG. 3 as a mobile handset or cell phone, the same system blocks may be found in any mobile device with wireless communication capability, such as a laptop computer, PDA or similar device.
  • The various sensors illustrated in FIG. 3 can be used to more accurately infer a user's status and activities. For example, if the breathalyzer sensor 355 determines that the user's BAC is above 0.1%, indicating the user may be alcohol-impaired, the user's avatar could be selected or generated to indicate the impairment. As another example, if the accelerometer senses rhythmic motion consistent with jogging, an avatar may be selected or generated to indicate the user is exercising. This inferred status based on accelerometer sensor information may be confirmed by comparison with GPS readings to determine if the user is moving at a jogging pace or is located at a health facility.
  • If the mobile device 301 includes a blood pressure or pulse rate sensor (not shown), information from such sensors may be checked to determine if sensor values are consistent with exercise.
  • Such sensors may enable distinguishing running or biking from traveling in a car, bus or train, which could produce similar accelerometer and GPS information.
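  • For instance, the jogging inference described above might be approximated by checking for rhythmic accelerations at a plausible cadence and a GPS speed in a jogging range; a rough sketch, with thresholds that are assumptions rather than values from the patent.

```python
import statistics

def looks_like_jogging(accel_samples, sample_hz, gps_speed_mph):
    """Heuristic check for rhythmic foot-fall accelerations confirmed by GPS pace.
    accel_samples is a list of vertical-acceleration magnitudes in g."""
    # Estimate cadence by counting upward crossings of the mean acceleration.
    mean = statistics.fmean(accel_samples)
    crossings = sum(
        1 for a, b in zip(accel_samples, accel_samples[1:])
        if a < mean <= b
    )
    cadence_hz = crossings * sample_hz / max(len(accel_samples) - 1, 1)
    rhythmic = 2.0 <= cadence_hz <= 3.5           # roughly 120-210 steps per minute
    jogging_pace = 3.0 <= gps_speed_mph <= 8.0    # distinguishes jogging from driving
    return rhythmic and jogging_pace
```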
  • While FIG. 3 shows the various sensors 350 - 356 electrically coupled to the processor 391 of the mobile device 301 , sensors may also be connected wirelessly: a wireless transceiver (such as WiFi transceiver 356 ) or a short range wireless transceiver (SRWTx) 357 may be incorporated to communicate with a variety of external sensors.
  • the short range wireless communication transceiver 357 may be a Bluetooth protocol transceiver, Zigbee protocol transceiver, or other technology/protocol transceiver.
  • FIG. 4 illustrates basic process steps employed in various embodiments.
  • the processor 391 of the mobile device 301 may poll some or all of the sensors (e.g., 350 - 355 ) connected to the processor 391 , step 401 .
  • the sensors may include external sensors that are wirelessly connected to the processor 391 via mid- to long-range wireless transceiver 356 and/or a short range wireless transceiver 357 included within the mobile device 301 .
  • the sensors may also be coupled to the processor 391 by various available ports or custom interface circuits.
  • the processor 391 may include multiple processors (i.e., processor 391 may be a multi-processor chip) with some sensors coupled to or interfacing with a processor and other sensors coupled to or interfacing with a second processor within the multiprocessor chip.
  • the processor 391 may access a data register where sensor data is buffered, or send a signal to the sensor (or sensor interface circuitry) directing it to take a reading and wait to receive sensor data. If the sensor is external to the mobile device 301 , the processor 391 may send a data request message to the sensor via a wired or wireless data connection and then wait to receive sensor data in a response message via the same data link.
  • the processor 391 may simply receive data transmitted by each of the sensors (e.g., 350 - 355 ) without prompting.
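  • A sketch of this polling step, covering the three access modes just described (buffered register read, commanded reading, and request/response to an external sensor); every method name here is a hypothetical interface, not an API defined by the patent.

```python
def poll_sensors(processor, sensors, timeout_s=1.0):
    """Collect one reading from each sensor using whichever access mode applies."""
    readings = {}
    for sensor in sensors:
        if sensor.has_buffered_data():        # data already waiting in a register
            readings[sensor.name] = sensor.read_register()
        elif sensor.is_external:              # external sensor over a wired/wireless link
            processor.send(sensor.address, "DATA_REQUEST")
            readings[sensor.name] = processor.wait_response(sensor.address, timeout_s)
        else:                                 # internal sensor: command a fresh reading
            sensor.trigger_reading()
            readings[sensor.name] = sensor.wait_for_data(timeout_s)
    return readings
```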
  • the processor 391 may also retrieve calendar data stored in memory 392 , step 402 .
  • calendar data may be used to infer the activity and location of the user of the mobile device 301 .
  • Calendar data may be obtained for the particular time of day and day of week.
  • the processor 391 may also obtain the current time and date that may be stored in a data register.
  • the processor 391 may retrieve various mobile device settings, step 403 .
  • Such device settings may include the selected ringer type (e.g., silent or audible), theme settings, normal or roaming communication mode, battery power level, current cellular communications status, such as whether the user is engaged in a phone call or accessing the Internet on a mobile browser, current wireless communications status, such as whether the user is presently connected to a WiFi network, and current local area wireless network status, such as whether the user is presently using a Bluetooth device.
  • Each of these mobile device settings and operating conditions may provide information regarding the user's status. For example, if the user has set the mobile device 301 to display a casual theme (e.g., whimsical wallpaper, musical ringer, etc.) this may indicate that the user is not engaged in business or work activities.
  • the step of gathering sensor data may be performed after the step of gathering mobile device setting and calendar data.
  • the information provided by the device settings or indicated in the user's calendar may be used to make an initial inference regarding the user's activity which may be confirmed or further refined by selected sensor data.
  • the mobile device processor 391 may be configured to poll only the GPS sensor, since the user's status (talking on a cell phone) is already established, and only the location remains to be determined.
  • the processor 391 may be configured with software to poll only those sensors necessary to confirm that status.
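  • Such confirmation-only polling could be driven by a simple lookup from the already-established status to the sensors still worth reading, as in this hypothetical sketch (the status labels and sensor sets are invented for illustration).

```python
# Map an already-established status to the sensors that remain worth polling
# to confirm or refine it, saving battery and processor time.
CONFIRMATION_SENSORS = {
    "on_call": ["gps"],                       # status known; only location is needed
    "meeting": ["gps", "accelerometer", "temperature"],
    "unknown": ["gps", "microphone", "light", "accelerometer", "temperature"],
}

def sensors_to_poll(established_status):
    """Return the minimal sensor set needed for the current situation."""
    return CONFIRMATION_SENSORS.get(established_status, CONFIRMATION_SENSORS["unknown"])
```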
  • a processor can infer a user's current status and determine which one of a group of avatars to display, step 404 .
  • a variety of processors may be used to make this determination in the various embodiments.
  • the processor 391 of the mobile device 301 is configured with software instructions to select an avatar for display based upon criteria stored in its memory 392 .
  • the avatar selected for display by the mobile device 301 may be uploaded to another computing device 113 - 117 .
  • the selected avatar may be sent to another mobile computing device as an attachment to an e-mail or SMS message.
  • the mobile device can send a URL or other network address to another computing device to enable the receiving computer to obtain the avatar by accessing the provided URL or other address.
  • The mobile computing device 301 may transmit a memory pointer indicating the memory storage location of the selected avatar file to another computing device so that the receiving computing device can obtain the avatar from its own memory.
  • This embodiment may be employed where the avatar file is stored in memory (e.g., hard disc memory) of a server 109 , thereby enabling the server to load the selected avatar to the user's webpage.
  • This embodiment may also be employed with other computing devices 113 - 117 .
  • the mobile computing device 301 may transmit the sensor, calendar, and settings data to another computing device, such as server 109 or other computing devices 113 - 117 , so that the avatar determination, step 404 , can be performed by the processor of the receiving computing device.
  • the sensor, calendar, and settings data are received and stored in memory before the computing device's processor makes the avatar determination.
  • the avatar file can be made available for display on a computing device 113 - 117 , step 405 .
  • the computing device 113 - 117 displays the selected avatar by accessing the avatar file that was either pre-stored in memory (e.g., using an address or memory pointer communicated by the mobile device 301 ) or downloaded from either the mobile device 101 - 104 or a server 109 .
  • the avatar may be accessed and displayed as part of an Internet webpage hosted by a server 109 .
  • The avatar files may be updated annually or at some other interval so that the avatar reflects the age of the user.
  • the various avatar files may be updated to display an older and more mature avatar.
  • The avatar files may depict the user with graying hair or weight loss or gain, as appropriate to the actual appearance of the user. In this manner, when the appropriate avatar file is accessed or retrieved, the avatar will accurately reflect the user's age.
  • FIG. 5 is a process flow diagram of a first embodiment method in which the avatar to be displayed is determined by the mobile device 301 and is constantly updated to reflect the user's current status.
  • the mobile device 301 processor 391 periodically polls each of the sensors (e.g., 350 - 356 ) associated with the mobile device 301 , step 401 .
  • The processor 391 may perform a loop in which each of the various sensors is polled, and the associated sensor data is received and stored in an appropriate data record within the mobile device's memory 392 .
  • the processor 391 also checks the calendar data stored in the memory 392 , step 402 .
  • the calendar data may indicate where the user is expected to be and the particular activity the user is expected to be engaged in at the current time.
  • the processor 391 also checks the various settings of the mobile device, step 403 , including the current theme. As noted above, the order in which the processor 391 obtains data is not necessarily important and can vary among implementations.
  • the processor 391 stores all of the sensor, calendar, and settings data in a parameter value table, step 409 .
  • the parameter value table will be discussed in more detail below with reference to FIG. 6 a.
  • the processor 391 can evaluate the data stored in the parameter data table, step 410 , to determine which avatar to display, step 411 .
  • a variety of method embodiments may be used to evaluate the stored parameter data and select a particular avatar for display.
  • the parameters stored in the data table are compared to values stored in a selection table, such as illustrated in FIG. 6 b .
  • A user may program his/her mobile device to select a particular avatar by entering associated selection criteria in such a selection table. In this manner, any user can easily configure the mobile device to personal preferences and settings.
  • the processor 391 determines the avatar to display, step 411 , by selecting the avatar for which the greatest number of selection criteria are satisfied by the parameters stored in the parameter data table.
  • the parameter data may be evaluated in a logic tree programmed in software.
  • the processor 391 executes a software routine in which the particular steps performed (e.g., a series of “if X, then Y” logic tests) depend upon certain parameter values. While the use of a programmed logic tree routine may operate faster, this embodiment may be more difficult for a user to configure and may allow fewer user selection options.
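  • For the table-matching approach, a minimal sketch of "select the avatar with the greatest number of satisfied criteria" might look like the following; the criteria loosely echo the "work" avatar example discussed below and are otherwise invented.

```python
def select_avatar(parameters, selection_table):
    """Pick the avatar whose criteria are satisfied by the most parameter values.
    selection_table maps avatar names to {parameter: predicate} criteria."""
    def score(criteria):
        return sum(
            1 for name, predicate in criteria.items()
            if name in parameters and predicate(parameters[name])
        )
    return max(selection_table, key=lambda avatar: score(selection_table[avatar]))

selection_table = {
    "work": {
        "location":  lambda v: v == "office",
        "speed_mph": lambda v: v < 1,
        "noise_db":  lambda v: v < 40,
        "light":     lambda v: v == "bright",
        "wallpaper": lambda v: v == "company logo",
    },
    "jogging": {
        "location":  lambda v: v == "track",
        "speed_mph": lambda v: 3 <= v <= 8,
        "accel":     lambda v: v == "periodic",
    },
}

# select_avatar({"location": "office", "speed_mph": 0.3, "noise_db": 35,
#                "light": "bright", "wallpaper": "company logo"}, selection_table)
# returns "work" (5 criteria satisfied vs. 0 for "jogging")
```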
  • the processor 391 can direct the transmitter 398 to transmit the selected avatar to a server 109 via a wireless or cellular network, and the Internet 108 , step 415 .
  • the processor 391 may transmit the selected avatar file, a pointer or address to memory containing the selected avatar file, or an identifier of the selected avatar that the server 109 can use to locate the corresponding avatar file within its memory.
  • the processor 391 may periodically repeat the steps of obtaining parameter values so as to continually update the displayed avatar. If the user has generated appropriate avatars and configured the mobile device to appropriately link individual avatars to various parameter values, this embodiment can enable the server 109 to display an avatar that reflects the user's current status. In an embodiment, the processor 391 may optionally pause for a pre-determined amount of time before repeating the steps of obtaining parameter values in order to reduce the demand on the processor 391 , step 450 .
  • the transmitted data indicating a particular avatar to display is received by the server 109 , step 420 .
  • the server processor (not shown separately) of the server 109 may include the selected avatar file in a webpage hosted on the server 109 for the user for public display, step 430 .
  • the access request is received by the server 109 , step 431 .
  • the server 109 transmits the webpage with the selected avatar file as an HTML file to the computing device 113 - 117 of the second user, step 432 .
  • the receiving computing device 113 - 117 then displays the selected avatar to the second user, step 441 .
  • This embodiment ensures that whenever a second user accesses the first user's webpage, step 440 , an avatar reflecting the first user's current status is displayed, step 441 . Since the polling and analysis of sensor and settings data is performed autonomously, the user's avatar presented to others is kept consistent with the user's current status without input by the user.
  • Data from sensors (e.g., 350 - 356 ), the user's calendar, and mobile device settings can be stored in a parameter value table 600 .
  • Such data may be stored in the form of absolute values (i.e., the raw sensor information), or in the form of processed information (i.e., interpreted sensor information). This distinction turns on the amount of processing of the information that is done before it is stored in the parameter value table 600 .
  • FIG. 6 a illustrates processed GPS sensor information stored in the parameter value table 600 , in which the raw geographic fix coordinate values have been interpreted and compared to a user-created table of locations, enabling the mobile device 301 to recognize that the device is currently located at the “Track” and is moving at a rate of about 4 mph.
  • raw data from an ambient noise sensor 350 has been processed and the recognized characteristic of less than 30 dB has been stored in the parameter value table 600 .
  • ambient light sensor 351 data has been processed and a recognized characteristic of “bright” has been stored in the parameter value table 600 .
  • accelerometer 353 data has been processed and stored as a range of acceleration values (in g's) and a recognized characteristic of the acceleration (periodic).
  • Raw sensor data may also be stored directly in the parameter value table 600 .
  • temperature sensor data of 87 degrees F. is stored in the table.
  • the fact that nothing is stored in the user's calendar for the present time is stored in the parameter value table 600 as either no data or a symbol indicating that no data is present in the calendar database.
  • the particular wallpaper employed on the mobile device 301 and the ring tone setting (‘loud’) are stored in the parameter value table 600 .
  • processed data values stored in the parameter value table 600 illustrated in FIG. 6 a are for explanatory purposes only. As one of skill in the art would appreciate, sensor and setting data are more likely to be stored as digital data that can be interpreted by the processor 391 . For example, processed data recognized as being consistent with the particular range or value may be stored as a symbol or binary value.
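  • In code, such a parameter value table might be a simple mapping in which each entry records whether it holds raw or processed data; the values below mirror the FIG. 6 a examples, with invented placeholders (e.g., the wallpaper name) where the text gives none.

```python
# A parameter value table in the spirit of FIG. 6a: processed entries store an
# interpreted characteristic, raw entries store the reading itself.
parameter_value_table = {
    "gps":           {"kind": "processed", "value": {"place": "Track", "speed_mph": 4}},
    "ambient_noise": {"kind": "processed", "value": "< 30 dB"},
    "ambient_light": {"kind": "processed", "value": "bright"},
    "accelerometer": {"kind": "processed", "value": {"pattern": "periodic"}},
    "temperature":   {"kind": "raw",       "value": 87},    # degrees F, stored unprocessed
    "calendar":      {"kind": "raw",       "value": None},  # no entry for the present time
    "wallpaper":     {"kind": "raw",       "value": "beach scene"},   # placeholder name
    "ring_tone":     {"kind": "raw",       "value": "loud"},
}
```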
  • this information can easily be compared to criteria stored in an avatar selection logic table 601 , an illustrative example of which is shown in FIG. 6 b.
  • An avatar selection logic table 601 may be stored in memory of the computing device which determines the appropriate avatar to display. Thus, if the mobile device 301 determines the avatar to be displayed, such as described above with reference to FIG. 5 , then the avatar selection logic table 601 will be stored in the memory of the mobile device 301 . In contrast, if the avatar selection is made by a server 109 , then the avatar selection logic table 601 may be stored on the hard disk memory coupled to the server processor. Similarly, if another computing device 113 - 117 makes the avatar selection, then the avatar selection logic table 601 would be stored on that device. Additionally, the avatar selection logic table 601 may be stored on more than one computing device.
  • the avatar selection logic table 601 may be stored on the mobile device 301 and on a server 109 that hosts the user's avatar and webpage. Further, different versions of the avatar selection logic table 601 may be stored on different computing devices, enabling a user to vary the avatar selection criteria based upon the particular computer being used to make the selection. Thus, the avatar selection logic table 601 may be stored in the memory associated with each of these devices depending upon which embodiment is implemented. Additionally, the avatar selection logic table 601 may be transmitted from one device to another to enable the receiving computing device to make the avatar selection.
  • Using an avatar selection logic table 601 to perform the avatar selection provides greater user flexibility and control over the process.
  • The various embodiments are intended to provide users with a flexible and accurate means for presenting avatars reflecting their personal preferences and activities. Thus, there is benefit in giving users fine control over the process used to select and display the avatars of their choice.
  • Use of an avatar selection logic table 601 also simplifies the user's setup process when many sensor and setting criteria are employed.
  • a native application may be provided on the mobile device to enable a user to change the avatar selection criteria or otherwise manually control the avatar presented at any given time.
  • Users may customize the avatar selection logic table 601 such that the avatar file selected for display is chosen based upon a variety of parameters. This customization may be accomplished during a user setup process. Also, avatar selection criteria can be selected and the avatar selection logic table 601 populated during a user setup process in which the user makes personal selections in order to customize the user's own avatar behavior. Such a setup process may be accomplished with the aid of an interactive menu application running on a computing device. Such an interactive menu application may include user tools for creating and modifying and storing avatars, as well as menus for programming the avatar selection logic table 601 . As part of the process for creating avatars, such a menu application may require the user to assign a name or descriptor of each avatar that can be saved in the avatar selection logic table 601 . Once an avatar is created, or after all avatars have been created, the menu application may then prompt the user to enter values or ranges to be used as criteria for selecting each avatar, and store the user's responses in the appropriate fields of the avatar selection logic table 601 .
  • FIG. 6 b shows an avatar selection logic table 601 in which a user has defined an avatar entitled “work”, stored as data record 610.
  • a “work” avatar will show a graphic representation of the user engaged in the user's occupation.
  • selection criteria for the work avatar include: a location at the office and low velocity as recorded by a GPS sensor; low ambient noise; “bright” ambient light conditions; zero or low accelerometer readings (e.g., consistent with sitting or walking); and professional wallpaper settings (such as the company logo).
  • the work avatar selection criteria may not include values for some parameters which the user anticipates are not likely to help resolve the user's status for that particular avatar.
  • the user has decided that the ambient temperature, calendar and ring tone values provide no additional value for selecting the work avatar.
  • the avatar selection logic table 601 includes a “meeting” avatar stored as data record 611 .
  • the mobile device may be programmed with a software application to enable a user to populate the avatar selection logic table by performing an activity while recording sensor and setting data, and then identifying the avatar to associate with the recorded values.
  • a user that frequently jogs on a particular track may calibrate the avatar selection logic table by activating a calibration process while jogging at the track.
  • the mobile device 301 can record the sensor values and device settings during the activity, average the values, and store the average or range within the avatar selection logic table 601 . In this manner, the mobile device can record the GPS coordinates of the track, the speed range of the user while jogging, the ambient noise, light and temperature conditions, and the accelerometer readings while jogging.
  • sensor values may exhibit characteristic patterns during particular activities that may be recognized and recorded in a calibration process.
  • an accelerometer may be able to recognize when a user is jogging based upon periodic accelerations with values and periodicity consistent with foot falls.
  • the mobile device 301 may also record the device settings selected by the user during the activity.
  • FIG. 6 c illustrates an example embodiment calibration method suitable for completing an avatar selection logic table.
  • a user may select a particular avatar to be calibrated, step 610 .
  • This avatar may have already been created by the user and given a name. Alternatively, the user may enter a name for an avatar yet to be created.
  • the user then begins the activity and initiates the calibration, such as by pressing a particular key on the mobile handset, step 612 .
  • the mobile device 301 records this sensor data and device settings, step 614 . For some sensors, this may involve recording sensor readings over a period of time along with the time of each recording in order to be able to recognize time-based patterns.
  • the user may end the calibration, such as by pressing a particular key on the mobile device, step 616 .
  • the calibration may proceed for a preset amount of time, so that step 616 occurs automatically.
  • the processor 391 of the mobile device 301 can analyze the recorded sensor data using well-known statistical processes.
  • the sensor data may be statistically analyzed to determine the average sensor value and the standard deviation of sensor values. This calculation may be used to provide a mean with a range (i.e., ±σ) value characterizing the particular activity.
  • the sensor data may be analyzed to determine the maximum and minimum value, thereby determining the actual range of measurements during the activity. This analysis may be appropriate particularly for GPS coordinate values in order to determine the boundaries of the activity (e.g., perimeter of a jogging track).
  • Sensor data may also be analyzed over time to determine characteristics of the values, such as whether accelerometer readings vary periodically, as may be the case while jogging or walking, or randomly, as may be the case in other activities. More sophisticated analysis of data may be employed as well, such as processing recorded ambient noise to detect and record particular noise patterns, such as the sound of an espresso machine.
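  • As a hedged sketch of the statistical reduction just described, the following Python function condenses readings recorded during a calibration run into a mean with a ±σ range plus the observed minimum/maximum; the function name and dictionary keys are illustrative assumptions, not part of this disclosure.

```python
import statistics

def summarize_calibration(samples):
    """Reduce sensor readings recorded during a calibration run to
    selection criteria: a mean with a +/- one-sigma range, plus the
    observed min/max (e.g., the boundary of a jogging track for GPS)."""
    mean = statistics.mean(samples)
    sigma = statistics.stdev(samples) if len(samples) > 1 else 0.0
    return {
        "mean": mean,
        "range": (mean - sigma, mean + sigma),     # mean +/- one sigma
        "observed": (min(samples), max(samples)),  # actual measurement bounds
    }

# Example: accelerometer magnitudes sampled while jogging
print(summarize_calibration([1.8, 2.1, 1.9, 2.4, 2.0]))
```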
  • the conclusions of the analyzed sensor data and mobile device settings may be stored in the avatar selection logic table 601 in a data record including the avatar name, step 620 .
  • the avatar selection logic table 601 may be stored in memory 392 of the mobile device 301 .
  • the user may repeat the process of selecting an avatar for calibration, engaging in the associated activity while allowing the mobile device 301 to perform a calibration routine and storing the analyzed sensor data in the avatar selection logic table 601 until criteria have been saved for all of the user's avatars.
  • the table may be transmitted to another computing device, such as a server 109 on which the user's avatar is hosted, step 622 .
  • Users may repeat the process illustrated in FIG. 6 c at any time to update the calibration or criteria for a particular avatar or to add selection criteria for a newly created avatar.
  • this self-calibration method simplifies the process of setting up the avatar selection criteria. Further, users may not recognize how various activities impact their mobile device, and thus this embodiment method enables avatars to be selected based on as many sensors and setting values as can be recorded, even those of which the user may not be aware. For example, a coffee shop may have a number of characteristic background noises, such as the sound of an espresso machine, which the user may not notice.
  • the avatar selection logic table 601 shown in FIG. 6 b is intended for illustrative purposes only. More or fewer selection criteria may be included in the table depending upon the sensors and settings included in mobile device 301 .
  • the avatar selection table may also include fewer parameters if the user decides that the avatar to display can be determined using fewer parameters.
  • the specific parameters associated with each avatar in the avatar selection table are merely illustrative and will be altered according to each individual user's preferences.
  • a processor can compare each value to corresponding criteria in the avatar selection logic table 601 .
  • a variety of algorithms may be employed to determine which of the avatar's selection criteria are most closely satisfied by any of the values in the parameter value table 600 .
  • a simple sum of the number of satisfied criteria may be sufficient to determine the appropriate avatar to assign.
  • weighting factors may be applied to selected criteria so that some measured sensor values are given greater weight when selecting an avatar.
  • one or two criteria may be used to make a preliminary determination of the current status, followed by a comparison of parameter values against confirmatory criteria.
  • GPS and calendar data may be used as primary indicators of particular activities, with noise, light and accelerometer data used to confirm the activity indicated by GPS location or calendar entries.
  • the GPS values stored in the parameter value table 600 most closely match the criteria for the running avatar, data record 618 .
  • the running avatar can then be confirmed as an appropriate selection by comparing the accelerometer data with the corresponding criteria in the avatar selection logic table 601 .
  • the accelerometer data distinguishes the activity from driving by or walking near the track.
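  • One possible realization of these matching algorithms is sketched below in Python: each avatar's criteria are scored by an (optionally weighted) count of satisfied criteria, which also accommodates the primary-indicator approach by assigning a larger weight to, e.g., the GPS parameter. All function names and table values are hypothetical assumptions.

```python
def criterion_satisfied(value, criterion):
    """True if a measured value meets one criterion. A tuple is treated
    as an inclusive (min, max) range; anything else as an exact match."""
    if isinstance(criterion, tuple):
        lo, hi = criterion
        return lo <= value <= hi
    return value == criterion

def select_avatar(parameters, table, weights=None):
    """Pick the avatar whose selection criteria are most nearly satisfied.
    `table` maps avatar names to {parameter: criterion} dicts; `weights`
    optionally gives some parameters greater influence."""
    weights = weights or {}
    best, best_score = None, -1.0
    for name, criteria in table.items():
        score = sum(weights.get(p, 1.0)
                    for p, c in criteria.items()
                    if p in parameters and criterion_satisfied(parameters[p], c))
        if score > best_score:
            best, best_score = name, score
    return best

table = {
    "work":    {"gps": "office", "noise_db": (0, 50)},
    "meeting": {"gps": "office", "noise_db": (50, 90), "ringtone": "silent"},
}
# GPS treated as a primary indicator by giving it a larger weight
print(select_avatar({"gps": "office", "noise_db": 62, "ringtone": "silent"},
                    table, weights={"gps": 3.0}))   # -> "meeting"
```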
  • the mobile device 301 may be configured with software instructions to ask the user whether it has correctly diagnosed the current activity (such as running), or ask the user to name the current activity. This method can enable the mobile device 301 to “learn” when parameters meet the desired criteria.
  • the avatar selection logic tables of other previous users may be used to populate a new avatar selection logic table for a new user. For example, if a number of previous users have assigned a “jogging” avatar to a set of sensor data which includes a particular GPS location (track), accelerometer readings, noise, light, etc., an artificial intelligence routine running on the server may recommend the “jogging” avatar to the new user when the same or similar set of sensor data is generated during a calibration routine.
  • the artificial intelligence routine may analyze each of the hosted avatar selection logic tables to identify patterns or commonalities of assigned avatars and corresponding sensor data. By identifying these patterns, the server may recommend an avatar based upon the sensor data gathered during a calibration process.
  • the GPS sensor location and velocity data will provide a good indication of the avatar to display.
  • multiple avatars may be associated with the same GPS location.
  • data records 610 and 611 in the avatar selection logic table 601 both include GPS location criteria of the user's office. This ambiguity may be resolved by considering the ambient noise level, which if it exceeds 50 dB would indicate a meeting, or the ring tone setting, which if set to “silent” would likewise indicate that the “Meeting” avatar should be displayed instead of the “Work” avatar.
  • Alternative embodiments may ask the user (such as by way of a prompt presented on the display) the nature of an activity in order to learn and associate the recorded sensor and setting data with the specific activity.
  • the mobile device 301 may ask the user to identify a particular avatar to be associated with the present activity in the future. In this manner, the displayed avatar may more accurately represent the user's status.
  • FIG. 7 is a process flow diagram of an alternative embodiment which conserves processing power of the mobile device by including a decision step 405 which determines if any of the parameters stored in parameter value table 600 has changed. If the sensor values and device settings are the same as those already stored in the parameter value table 600 , in other words if no parameter value has changed, then no change in the avatar is required, and the steps of selecting an avatar for display and transmitting the selection to a server (steps 410 - 415 ) do not need to be performed. Accordingly, if no parameter has changed in step 405 , the processor can return to the process of polling sensors and checking settings, steps 401 - 409 .
  • steps 401 - 409 may be repeated after some predetermined delay, step 450 , to reduce the amount of processing time dedicated to monitoring sensors.
  • if the processor determines that a value in the parameter value table 600 has changed, then the method can continue on to select a particular avatar for display and transmit the avatar to a server, steps 410 - 415 , in a manner similar to that described above with reference to FIG. 5 .
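  • A minimal Python sketch of this FIG. 7 loop follows, assuming hypothetical `poll_sensors` and `select_and_transmit` callables standing in for steps 401 - 409 and 410 - 415 respectively; neither name comes from this disclosure.

```python
import time

def monitor_loop(poll_sensors, select_and_transmit, delay_s=60):
    """Re-select and transmit an avatar only when some polled parameter
    value has changed (step 405), pausing between polls (step 450)."""
    last_values = None
    while True:
        values = poll_sensors()           # steps 401-409: sensors, calendar, settings
        if values != last_values:         # step 405: did any parameter change?
            select_and_transmit(values)   # steps 410-415: select avatar, send to server
            last_values = values
        time.sleep(delay_s)               # step 450: predetermined delay
```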
  • FIG. 8 is a process flow diagram of an embodiment which conserves processing power of the mobile device 301 by only polling sensors, checking device settings and selecting an avatar for display, steps 401 - 415 when prompted by the server 109 hosting the user's webpage.
  • the server 109 periodically transmits a request to the mobile device 301 requesting it to transmit the current avatar, step 455 .
  • the process of requesting an update from the mobile device 301 is referred to in the figures as a “ping” of the mobile device 301 .
  • the server 109 may send a request for the avatar to the mobile device 301 periodically or at pre-determined times in order to maintain on the user's website an avatar reflecting the current status of the user.
  • the processor 391 of the mobile device 301 conducts the process of polling sensors, step 401 , checking calendar data, step 402 , checking device settings, step 403 , storing the data in a parameter value table, step 409 , selecting an avatar for display by comparing the parameter values to avatar selection criteria, steps 410 , 412 , and transmitting the selected avatar to the server 109 , step 415 , all in a manner similar to that described above with reference to FIG. 5 .
  • depending upon the frequency of the requests sent in step 455 , the user may complete and change activities between avatar updates.
  • the displayed avatar may not always accurately represent the user's current status.
  • the currency of the displayed avatar can be enhanced by increasing the frequency of avatar update requests sent to the mobile device, step 455 , but at the cost of additional processing overhead.
  • conversely, decreasing the frequency of update requests sent to the mobile device 301 may reduce processing overhead.
  • a trade-off can be managed between currency and mobile device processor overhead.
  • FIG. 9 is a process flow diagram of another embodiment which conserves processing power of the mobile device 301 by only polling sensors, checking device settings and selecting an avatar for display, steps 401 - 415 , when the server 109 receives a request for the user's avatar.
  • a second user may send a request to access the user's webpage containing the avatar to the server 109 , step 440 .
  • when the server 109 receives the request for the user's avatar, step 431 , the server 109 sends an avatar update request to the mobile device, step 455 .
  • the processor 391 of the mobile device 301 conducts the process of polling sensors, step 401 , checking calendar data, step 402 , checking device settings, step 403 , storing the data in a parameter value table, step 409 , selecting an avatar for display by comparing the parameter values to avatar selection criteria, steps 410 , 412 , and transmitting the selected avatar to the server 109 , step 415 , all in a manner similar to that described above with reference to FIG. 5 .
  • the server 109 can insert the avatar into the user's webpage, step 430 , and transmit the webpage including the avatar to the requester, step 432 , where it can be displayed by the requester's browser, step 441 .
  • the mobile device 301 only has to poll the sensors and retrieve calendar and device settings data when a second user wants to view the user's avatar. This minimizes the processing resources of the mobile device 301 dedicated to providing current status avatars. While the embodiment illustrated in FIG. 9 may result in a slight delay before the requester can view the avatar, the resulting avatar will accurately reflect the user's current status.
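  • A hedged sketch of the corresponding server-side handling in FIG. 9 follows, where `ping_mobile_for_avatar` (step 455, which triggers steps 401 - 415 on the device) and `render_page` (steps 430 , 432 ) are assumed helper functions, not APIs from this disclosure.

```python
# Server-side flow: the avatar is refreshed only when someone actually
# requests the user's webpage, minimizing work on the mobile device.
def handle_webpage_request(user_id, ping_mobile_for_avatar, render_page):
    avatar_file = ping_mobile_for_avatar(user_id)  # step 455: device polls on demand
    return render_page(user_id, avatar_file)       # steps 430, 432: insert avatar, send page
```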
  • FIG. 10 is a process flow diagram of an embodiment in which the selection of the avatar is made by the server 109 which hosts the user's webpage.
  • the process of polling sensors, checking calendar data and recording device settings in any parameter value table, steps 401 - 409 are substantially the same as those described above with reference to FIG. 5 .
  • the mobile device transmits the parameter value table to the server 109 , step 416 .
  • the data saved in the parameter value table 600 and transmitted to the server 109 in step 416 may be processed data or raw sensor data, depending upon the implementation.
  • the processor 391 of the mobile device 301 may repeat the process of polling sensors, etc., steps 401 - 409 , so that the sensor and setting data reflecting the user's current status is transmitted periodically to the server 109 .
  • the mobile device processor 391 may pause, step 450 , before repeating polling and recording steps 401 - 404 .
  • the parameter value table 600 is received by the server 109 , step 417 , where the table may be stored in hard disk memory.
  • mobile device 301 may transmit sensor parameter values sequentially to the server 109 , which then can populate the parameter value table 600 as the data is received in step 417 .
  • the values may be compared to avatar selection criteria in the avatar selection logic table 601 , step 418 .
  • the process of comparing parameter values to the avatar selection criteria may be accomplished in the server 109 in a manner substantially similar to that described above with reference to step 411 and FIG. 5 .
  • the server 109 selects an appropriate avatar for display, step 419 , and prepares the avatar for display, such as by inserting the avatar into the user's webpage hosted by the server 109 .
  • an avatar reflecting the user's current status will be displayed when, in response to someone accessing the avatar, step 440 , the server 109 transmits the avatar, step 432 , for display on the requester's computing device, step 441 .
  • This embodiment obviates the need for the mobile device to select the appropriate avatar and transmit the avatar to the server, thereby reducing processor overhead and saving power. Because most mobile devices 301 are limited in their processing power and battery capacity, offloading the avatar selection to the server 109 processor can enable the mobile device 301 to operate longer on a single battery charge.
  • FIG. 11 illustrates an alternative embodiment in which the avatar selection is performed by a server 109 , but the mobile device 301 only transmits sensor, calendar and setting data if such data has changed.
  • the processing of this embodiment is substantially the same as described above with reference to FIG. 10 with the addition of a test, step 405 , to determine if any parameter values have changed since the last transmission of the parameter value table, step 416 . If no parameter values have changed, then the processor 391 of the mobile device 301 repeats the process of polling sensors etc., steps 401 - 409 , after an optional pause, step 450 . Only if a parameter value has changed does the mobile device transmit the updated parameter value table to the server 109 , step 416 .
  • This embodiment has the added advantage of transmitting the parameter value table only when an update to the values stored on the server 109 is required. Thus, this embodiment further saves on mobile device battery use.
  • FIG. 12 illustrates an alternative embodiment in which the avatar selection is conducted by the server 109 periodically requesting updates from the mobile device 301 .
  • the server 109 periodically requests an update of the parameter value table 600 , step 455 .
  • the mobile device 301 polls sensors, etc. and transmits the parameter value table 600 to the server 109 as described above with reference to steps 401 - 416 in FIG. 10 .
  • the rest of the steps in this embodiment are substantially the same as described above with reference to FIG. 10 .
  • This embodiment further reduces mobile device 301 battery consumption by limiting the polling of sensors, etc. to a periodicity that is controlled by the server 109 . As explained above with reference to FIG. 8 , the trade-off between avatar currency and battery consumption can be controlled by varying the periodicity of requests made to the mobile device 301 .
  • Alternative embodiments may also employ sensors which “sleep” and awaken only when some parameter (e.g., noise, light, acceleration, change of location, etc.) is detected.
  • FIG. 13 illustrates another alternative embodiment in which the avatar selection is conducted by the server 109 in response to requests from others to view the avatar.
  • the server 109 receives the request, step 431 , and in response sends an avatar update request to the mobile device, step 455 .
  • the mobile device 301 polls sensors etc., and transmits the parameter value table 600 to the server 109 as described above with reference to steps 401 - 416 in FIGS. 10 and 12 .
  • the rest of the steps in this embodiment are substantially the same as described above with reference to like numbered steps in FIG. 10 .
  • This embodiment even further conserves battery power by limiting the polling of sensors and transmission of data to occasions when someone accesses the user's avatar.
  • An illustrative embodiment of such a method to display a user's avatar is shown in FIG. 14 a .
  • the processing within the mobile device 301 is substantially similar to that described above with reference to FIG. 5 up to the point where the avatar is selected, step 411 .
  • the mobile device 301 may continuously poll the sensors, step 401 , and check calendar and settings data, steps 402 , 403 , to collect data regarding the user's current status.
  • the gathered data can be stored in a parameter value table 600 , step 409 , and compared against an avatar selection logic table 601 , step 410 . Based upon the comparison, an avatar file to display is selected, step 411 . In this manner, the avatar selection is continuously updated so that the selected avatar reflects the user's current status.
  • an optional pause or delay, step 450 may be taken between polling cycles.
  • a request for the avatar may be sent by a second user, step 460 , by e-mail, SMS message and/or a phone call.
  • the mobile device 301 processor 391 recalls the avatar file from memory and transmits the file to the requester, step 462 .
  • the requesting computing device can then display the avatar, step 441 .
  • the processor 391 may compare the source (e.g., return address or telephone number) of the contact request against a list of pre-approved second user devices (e.g., 113 - 117 ) to determine if the requesting device is authorized to receive the user's avatar.
  • This list of preapproved users can be similar to a list of friends and family telephone numbers and e-mail addresses.
  • a plurality of the user's avatar files can be pre-stored on the device initiating contact, such as on all computing devices that are preapproved to receive the user's avatar.
  • instead of transmitting the entire avatar file to the requester, only the avatar name needs to be transmitted in order for the requester to recall the avatar file from its own memory.
  • by pre-storing avatar files on pre-approved computing devices, the delay between the avatar request and its presentation to the requester is minimized since only the avatar name is transmitted.
  • such embodiments may impose significant storage requirements on multiple devices, as well as the time required to download all of the user's avatar files to each preapproved computing device.
  • the avatar files may be pre-stored on the pre-approved computing device as well as directly transmitted to the pre-approved computing device.
  • only an identifier of the selected avatar file is transmitted to the second user's device.
  • the alternative embodiment checks the local memory of the requesting device to determine whether the transmission of the avatar file is necessary or not. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and subsequently transmitted for display.
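  • The identifier-then-fetch exchange of steps 504 - 508 might look like the following on the requesting device; the cache structure and the `fetch_avatar_file` network call are illustrative assumptions, not APIs from this disclosure.

```python
# Fetch the full avatar file only on a cache miss; otherwise display
# the locally stored copy immediately.
local_avatar_cache = {}   # avatar_id -> avatar file bytes

def display_avatar_by_id(avatar_id, fetch_avatar_file, display):
    if avatar_id not in local_avatar_cache:                           # not in local memory?
        local_avatar_cache[avatar_id] = fetch_avatar_file(avatar_id)  # request the file
    display(local_avatar_cache[avatar_id])                            # step 441: display it
```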
  • FIG. 14 b illustrates an alternative embodiment which is substantially the same as the embodiment described above with reference to FIG. 14 a , except that an avatar file identifier (ID) is transmitted to the requesting device in step 504 .
  • the requesting device receives the avatar file, step 463 , and displays it, step 441 .
  • the received avatar file may be stored in local memory for future use.
  • the alternative embodiment allows for the display of new or updated avatar files.
  • power consumption is conserved by both the mobile device 301 and the requesting device by only requiring the transmission of a large avatar file when the requesting device does not already have the avatar file in local memory.
  • the alternative embodiment also conserves transmission bandwidth.
  • FIG. 15 a illustrates an alternative to the foregoing embodiment shown in FIG. 14 a , which includes a test, step 405 , to determine if any parameter has changed.
  • if no parameter has changed, the processor 391 may continue to collect sensor, calendar and settings data, steps 401 - 409 . If a parameter has changed, then the method may continue as described above with reference to like numbered steps in FIG. 14 a .
  • the embodiment shown in FIG. 15 b modifies the method shown in FIG. 15 a .
  • the alternative embodiment shown in FIG. 15 b conserves processing power, battery life and transmission bandwidth and further allows for the use of updated or new avatar files.
  • the embodiment shown in FIG. 15 b modifies the method shown in FIG. 15 a by further including steps 504 - 508 which transmit only an identifier of the selected avatar file to the second user's device.
  • the local memory of the requesting device is checked to determine whether the transmission of the avatar file is necessary or not. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and subsequently transmitted for display.
  • FIG. 16 illustrates an alternative embodiment which further conserves processor and battery time of the mobile device 301 by responding to an avatar request made by a second user.
  • An avatar request may be transmitted to the mobile device 301 by a second user, step 460 .
  • the avatar request is received by the mobile device 301 , step 461 , and in response the processor 391 gathers and stores sensor, calendar and settings data, steps 401 - 409 , as described above with reference to FIG. 5 .
  • the processor 391 then compares the gathered data in a parameter value table 600 to an avatar selection logic table 601 , step 410 , to select an avatar for display, step 411 .
  • the processor 391 transmits the selected avatar file to the requesting computing device, step 415 , which receives the avatar file, step 462 , and displays the selected avatar, step 441 .
  • the processor 391 may compare the source (e.g., return address or telephone number) of the contact request against a list of pre-approved second user devices (e.g., 113 - 117 ) to determine if the requesting device is authorized to receive the user's avatar.
  • This list of preapproved users can be similar to a list of friends and family telephone numbers and e-mail addresses.
  • a plurality of the user's avatar files can be pre-stored on the computing device requesting an avatar.
  • the user's avatar files may be stored on all computing devices that are preapproved to receive the user's avatar.
  • instead of transmitting the entire avatar file to the requester, only the avatar name needs to be transmitted in order for the requester to recall the avatar file from its own memory.
  • pre-storing avatar files on pre-approved computing devices the delay between the avatar request and its presentation to the requester is minimized since only the avatar name is transmitted.
  • such embodiments may impose significant storage requirements on multiple devices, as well as the time required to download all of the user's avatar files to each preapproved computing device.
  • the embodiment shown in FIG. 16 b modifies the embodiment method shown in FIG. 16 a by adding steps 504 - 508 which transmit only an identifier of the selected avatar file to the second user's device.
  • the local memory of the requesting device is checked to determine whether the transmission of the avatar file is necessary or not. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and subsequently transmitted for display.
  • the avatar files and avatar selection logic table 601 are pre-stored on computing devices that are authorized to request the user's avatar.
  • a preapproved computing device requests an avatar, step 460 , such as by calling, sending an e-mail, or sending an SMS message to the user.
  • the processor 391 of the mobile device 301 polls sensors, checks calendar data and checks device settings, steps 401 - 403 , and stores the data in a parameter value table 600 , step 409 , as described more fully above with reference to FIG. 5 .
  • the processor 391 transmits the parameter value table 600 to the requesting computing device, step 416 .
  • the requesting computing device receives the parameter value table, step 462 , and compares the values to an avatar selection logic table 601 stored on the computing device, step 464 , in order to select the appropriate avatar for display, step 466 . Having selected the appropriate avatar, the computing device then calls the avatar from its memory, and displays it, step 441 .
  • the process by which the avatar-requesting computing device compares parameter values to avatar selection criteria, steps 464 and 466 , is substantially the same as the similar steps described above as performed by the mobile device 301 or server 109 .
  • the user may also set authorization selection criteria to control who may view each avatar, at what times, and during what activities.
  • a user may provide a plurality of avatars corresponding to the same sensor settings but differing depending upon the identity of a requester (i.e., the second user making the request). Some avatars may provide less detail regarding the user's exact activity or may differ significantly from the activity in which the user is actually engaged. Further, a user may set authorization controls to override sensor information to ensure the displayed avatar does not correspond to the user's actual activity.
  • a user may set authorization levels to ensure that the user's boss may only view an avatar detailing the user's activity during work hours. At other times the avatar may be denied or hidden, or if an avatar is present, it may be a simple avatar indicating that the user is busy without depicting the activity in which the user is engaged.
  • a user may set authorization levels (also referred to as permission controls) so that the user's boss may view an avatar depicting the user at work even when the sensor data indicates that the user is at the gym.
  • FIG. 18 illustrates example process steps that may be employed in embodiments which select an avatar based in part upon an authorization level of a requestor.
  • the mobile device processor 391 may retrieve various pieces of data regarding the user's current activity, steps 401 - 403 . Once relevant data regarding the user's current activity is collected, the authorization level of the requestor is determined, step 501 . Various methods to determine the requestor's authorization level are described in more detail below. Once the authorization level of the requestor is determined it may be used as another parameter to select the appropriate avatar to display. In embodiments where the avatar is sent directly from the mobile device 301 to the requestor's device, the mobile device's processor 391 may check the authorization level of the requestor. In instances where the avatar is stored and inserted into a webpage at a central server location, the processor of the server or the processor of the mobile device may perform the step of checking the authorization level of the requestor.
  • Any of a variety of methods may be implemented to check a requestor's authorization level. For example, a requestor may be asked to provide some form of authentication credential, such as a user name and password to verify that the requestor is authorized to receive an avatar and/or has a particular authorization level. Alternatively, some specific computing devices may be authorized to view selected avatars.
  • the processor may check a static internet protocol (IP) address of the computing device submitting a request for the avatar to determine if the computing device is authorized to receive the avatar, such as by comparing the static IP address received in the avatar request message to a list of static IP addresses authorized to receive the avatar.
  • Any method which authenticates a requestor or a computing device transmitting a request to receive an avatar as an authorized user/device or categorizes a requestor or device into various authorization levels may be used in the various methods to perform the step of checking the authorization level of a requestor. Once the determination is made of whether the requestor is authorized to receive a particular avatar (or the level of the requestor's authorization), this criterion may be entered into the parameter table as another criterion to determine which avatar to display or transmit to the requestor.
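  • For illustration, the following Python sketch combines the two checks mentioned above, a credential lookup and a static IP allowlist; all identifiers and addresses below are hypothetical examples, and a real deployment would use proper authentication infrastructure.

```python
# Illustrative authorization data; every entry here is made up.
AUTHORIZED_IPS = {"198.51.100.7", "203.0.113.42"}             # example allowlist
AUTH_LEVELS = {"spouse_device": "family", "coworker_pc": "buddy"}

def authorization_level(requestor_id=None, source_ip=None):
    """Return an authorization level to record in the parameter table
    (step 501), or None if the requestor/device is not authorized."""
    if requestor_id in AUTH_LEVELS:
        return AUTH_LEVELS[requestor_id]
    if source_ip in AUTHORIZED_IPS:
        return "authorized"
    return None

print(authorization_level(source_ip="198.51.100.7"))   # -> "authorized"
```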
  • a processor can infer a user's current status. Combining this inference with the requestor's authorization level, the processor may select an avatar to display in accordance with the user's preferences, step 404 . As discussed above with respect to the various embodiments illustrated in FIGS. 5-17 , a variety of processors may be used to make this determination. For example, the determination of which avatar to display may be made by the processor of the mobile device 301 , the server 109 , or the requestor's computing device. In addition, once the appropriate avatar has been selected, it may be displayed in any of the ways described above. For example, the avatar may be displayed as part of a webpage hosted by a server, displayed directly on the requestor's computing device, or attached as an image file sent in an e-mail, SMS or other message.
  • data from mobile device sensors can be stored in a parameter value table 602 .
  • Such data may be stored in the form of absolute values (i.e., the raw sensor information), or in the form of processed information (i.e., interpreted sensor information). This distinction turns on the amount of processing of the information that is done before it is stored in the parameter value table 602 .
  • the parameter table 602 may include an additional column for recording whether the requestor is authorized or the requestor's authorization level, if there are more than two levels.
  • FIG. 19 a illustrates a parameter value table 602 similar to the parameter value table described above with reference to FIG. 6 with the addition of a column titled “authorization level” storing the results of the authorization level check performed in step 501 . In the illustrated example the authorization level check has determined that the requestor is authorized.
  • An illustrative example of an avatar selection logic table 603 is shown in FIG. 19 b , which is similar to the selection logic table 601 described above with reference to FIG. 6 b .
  • the avatar selection logic table 603 of FIG. 19 b includes an additional parameter column which records selection criteria relating to the authorization level of the requestor.
  • the avatar selection logic table 603 shown in FIG. 19 b may be stored in memory of the computing device which determines the appropriate avatar to display.
  • using an avatar selection logic table 603 to perform the avatar selection provides users with flexibility and control over the selection process.
  • Users may customize the avatar selection logic table 603 such that the avatar selected for display is chosen based upon a variety of parameters including the authorization level of the requestor.
  • the avatar selection logic table 603 provides users with the additional option of assigning different avatars for display based on the authorization levels of the requestor. This customization of the avatar selection logic table 603 may be accomplished during a user setup process. Any of the setup methods discussed above for populating the avatar selection logic table 601 may be implemented to construct the avatar selection logic table 603 .
  • the authorization level may simply be a binary value denoting whether a requestor is authorized.
  • two different avatars may be displayed for identical selection criteria of sensor, calendar and settings data depending on whether the requestor is authorized.
  • a more general avatar may be displayed in instances where the requestor is not authorized while a detailed or more accurate avatar may be displayed for the identical selection criteria of sensor, calendar and setting data if the requestor is authorized.
  • the first user may simply elect to assign a value of “no avatar,” meaning that no avatar is to be displayed or transmitted if the requestor is not authorized.
  • a first user may set multiple levels of authorization, each resulting in the display of a different avatar for identical selection criteria of sensor, calendar and settings values.
  • selection criteria for both data records 650 and 651 include: a location at the office and low velocity as recorded by a GPS sensor; low ambient noise; “bright” ambient light conditions; zero or low accelerometer readings (e.g., consistent with sitting or walking); calendar data indicating that a meeting is scheduled and professional wallpaper settings (such as the company logo).
  • the user may want to indicate that the user is at work and not disclose the exact activity in which the user is currently engaged.
  • while the calendar parameter may indicate that the user is currently in a “Board Meeting,” the avatar displayed to an unauthorized requester is the more generic “Work” avatar of the user engaged in the user's occupation.
  • if the authorization level stored in table 602 is “yes,” this may mean that the requestor is a co-worker, boss, or family member (for example) to whom the user wants to disclose an accurate avatar.
  • the user may wish to accurately indicate the activity in which the user is engaged so that more information will be conveyed to a known requestor.
  • the “meeting” avatar may be displayed showing the user engaged in a meeting or presentation.
  • the selection criteria include: a location at home and low velocity as recorded by a GPS sensor; “dark” ambient light conditions; zero accelerometer readings (e.g., consistent with sitting or sleeping).
  • Both data records 652 and 653 may not include values for some parameters which the user anticipates are not likely to help resolve the user's status for that particular avatar. For example, the user has decided that the ambient temperature, calendar data, wallpaper and ring tone values provide no additional value for selecting either avatar to display. Rather, the user may feel that if the user is at home, the user is likely sleeping. However, the user may not want to indicate to co-workers, or more specifically the user's boss, that the user is sleeping. Thus, only authorized requestors, such as friends and family members, will receive the “sleeping” avatar. Non-authorized requesters, such as the user's boss, will simply receive a “busy” avatar according to the selection criteria in data record 653 .
  • a user may program the avatar selection logic table 603 with multiple levels of authorization such that more than two different avatars may be displayed for the identical sensor, calendar and settings data depending upon the authorization level of the requestor.
  • the selection criteria include: a location at the golf course and low velocity as recorded by a GPS sensor; “bright” ambient light conditions; zero to low accelerometer readings (e.g., consistent with walking or riding in a golf cart); ambient temperature being greater than 75 degrees; and a calendar setting indicating a weekday. In such circumstances, the user may not wish to inform either the user's spouse or boss that the user is golfing on a weekday during business hours.
  • a requester whose authorization level indicates that the requestor is on the user's buddy list will be sent the “golfing” avatar (see data record 655 ).
  • the user's spouse may have an authorization level of “family” which causes the “busy” avatar to be selected for display.
  • the user's boss may have an authorization level which would cause a “work” avatar to be selected for display.
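  • The golfing scenario might be encoded as a single data record carrying one set of sensor criteria and per-level avatars, as in this illustrative Python sketch; the record contents and level names are assumptions, not values from this disclosure.

```python
# One sensor-criteria record mapped to different avatars by requestor level.
golf_record = {
    "criteria": {"gps": "golf course", "speed_mph": (0, 5), "light": "bright",
                 "temp_f": (75, 120), "calendar": "weekday"},
    "avatar_by_level": {"buddy": "golfing",   # buddy-list requesters (cf. record 655)
                        "family": "busy",     # e.g., the user's spouse
                        "work": "work"},      # e.g., the user's boss
}

def avatar_for(record, level):
    # Fall back to a generic avatar when a level has no specific entry
    return record["avatar_by_level"].get(level, "busy")

print(avatar_for(golf_record, "family"))   # -> "busy"
```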
  • FIG. 19 c illustrates an example embodiment calibration method suitable for completing an avatar selection logic table that includes a requestor's authorization level.
  • a user may select a particular avatar to calibrate, step 610 . This avatar may have already been created by the user and given a name. Alternatively, the user may enter a name for an avatar yet to be created.
  • the user then begins the activity and initiates the calibration, such as by pressing a particular key on the mobile handset, step 612 .
  • the mobile device 301 records sensor data and device settings, step 614 .
  • this may involve recording sensor readings over a period of time along with the time of each recording in order to be able to recognize time-based patterns.
  • the user may be permitted to take multiple sensor readings and average them to refine the calibration of parameter readings for an avatar.
  • avatars may be displayed to indicate an increasing or decreasing effort on the part of a user during an exercise session depending on the refined heart rate sensor readings.
  • the user may end the calibration, such as by pressing a particular key on the mobile device, step 616 .
  • the calibration may proceed for a preset amount of time, so that step 616 occurs automatically.
  • after step 614 , the user is given the option of adding an authorization level to the data record, step 615 .
  • the user may be prompted to select alternative avatars for the recorded sensor and setting data.
  • the user may be further prompted to select an authorization level corresponding to the selected alternative avatar such that requestors possessing the selected authorization level will receive the selected alternative avatar whenever the sensor and settings data match the recorded criteria.
  • the processor 391 of the mobile device 301 can analyze the recorded sensor data using well-known statistical processes as described more fully above with reference to FIG. 6 c.
  • the conclusions of the analyzed sensor data and mobile device settings may be stored in the avatar selection logic table 603 in a data record including the authorization level and avatar name, step 620 .
  • the avatar selection logic table 603 may be stored in memory 392 of the mobile device 301 .
  • the user may repeat the process of selecting an avatar for calibration, engaging in the associated activity while allowing the mobile device 301 to perform a calibration routine and storing the analyzed sensor data in the avatar selection logic table 603 until criteria have been saved for all of the user's avatars. This may include multiple avatar settings for different authorization levels.
  • the table may be transmitted to another computing device, such as a server 109 on which the user's avatar is hosted, step 622 .
  • a learning method as described above with respect to avatar selection logic table 601 may be implemented.
  • Users may repeat the process illustrated in FIG. 19 c at any time to update the calibration or criteria for a particular avatar or to add selection criteria for a newly created avatar.
  • FIG. 20 is a process flow diagram of an embodiment method in which the avatar to be displayed is determined based upon the sensor and setting data as well as the requestor's authorization level.
  • the mobile device 301 processor 391 performs steps 401 - 431 in a manner substantially the same as described above with reference to FIGS. 5-12 .
  • the processor of the server 109 can implement any of a number of methods discussed previously to check the authorization level of the person or device requesting access to the webpage (the “second user”), step 501 .
  • Data regarding the second user may be sent from the second user's device back to the server 109 processor to complete the check authorization level step, step 503 .
  • this information is stored in the parameter table held in memory of the server 109 .
  • the data record of the parameter table is compared against the criteria in an avatar selection logic table 603 stored in memory of the server 109 to determine which avatar to display, step 411 .
  • a variety of method embodiments may be used to evaluate the stored parameter data and select a particular avatar for display.
  • the processor of the server 109 may select the avatar for which the greatest number of selection criteria are satisfied by the parameters stored in the parameter data table.
  • the processor of server 109 can insert the selected avatar into the webpage and send it to the requestor as described more fully above with reference to FIGS. 5-12 for steps 430 - 441 . Since the polling and analysis of sensor and settings data is performed autonomously, the user's avatar presented to others is maintained consistent with the user's current status without input by the user, unless the user has elected to alter the avatar to be selected based upon the requestor's authorization level.
  • FIG. 21 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which conserves battery and processor time and determines an avatar to display based upon an avatar selection logic table which includes data related to the authorization level of the second user.
  • the embodiment illustrated in FIG. 21 includes steps 401 - 440 described above with reference to FIGS. 5-12 .
  • the processor selects an avatar to display in the same manner as described above with reference to FIG. 20 for steps 501 - 503 and 411 .
  • the processor of server 109 can insert the selected avatar into the webpage and send it to the requestor as described more fully above with reference to FIGS. 5-12 for steps 430 - 441 .
  • FIG. 22 is a process flow diagram of an embodiment which conserves processing power of the mobile device 301 by only polling sensors, checking device settings, steps 401 - 409 , when prompted by the server 109 hosting the user's webpage, step 455 .
  • This embodiment includes the steps 401 - 455 described above with reference to FIGS. 5-12 .
  • the processor receives and checks the requestor's authorization level, steps 501 - 503 , and selects the appropriate avatar, step 411 , as described above with reference to FIGS. 20 and 21 . Once the appropriate avatar has been selected, the processor of server 109 can insert the selected avatar into the webpage and send it to the requestor as described more fully above with reference to FIGS. 5-12 for steps 430 - 441 .
  • FIG. 23 is a process flow diagram of another embodiment which conserves processing power of the mobile device 301 by only polling sensors, checking device settings and selecting an avatar for display when the server 109 receives a request for the user's avatar.
  • the embodiment shown in FIG. 23 operates much in the same manner as the embodiment illustrated in FIG. 9 with the addition of checking the authorization level of the requester, steps 501 , 503 , described above with reference to FIGS. 20 and 21 .
  • the authorization level determined in steps 501 and 503 is transmitted to the mobile device 301 for storage into the parameter table 602 , step 409 as described above with reference to FIG. 5-9 .
  • the avatar may be selected based upon the authorization level of the second user. Once the appropriate avatar has been selected, the avatar can be sent to the requester as described more fully above with reference to FIG. 9 for steps 415 - 441 .
  • FIG. 24 a is a process flow diagram of another embodiment method suitable for displaying an avatar selected based upon sensor and setting data as well as the authorization level of a second user directly on the requesting device.
  • the embodiment illustrated in FIG. 24 a operates in the same manner as described above with reference to FIG. 14 a with the addition of checking the authorization level of the second user, steps 501 and 503 and storing the authorization level data in the parameter table 602 , step 502 , described above with reference to FIGS. 20 and 21 .
  • the complete parameter table 602 can be compared against the avatar selection logic table 603 to select and display the appropriate avatar as described above with reference to FIGS. 14 a , 20 and 21 for steps 411 , 432 , 463 , 442 .
  • the embodiment shown in FIG. 24 b modifies the embodiment method shown in FIG. 24 a by adding steps 504 - 508 which transmit only an identifier of the selected avatar file to the second user's device.
  • the local memory of the requesting device is checked to determine whether the transmission of the avatar file is necessary or not. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and subsequently transmitted for display.
  • the alternative embodiment shown in FIG. 24 b conserves processing power, battery life and transmission bandwidth and further allows for the use of updated or new avatar files.
  • the embodiment illustrated in FIG. 25 a conserves processor and battery time of the mobile device 301 by responding to an avatar request made by a second user.
  • the embodiment illustrated in FIG. 25 a operates in the same manner as the embodiment described above with reference to FIG. 16 a with the addition of checking and sending the authorization level of the second user, steps 501 and 503 described above with reference to FIGS. 20 and 21 .
  • the authorization level data is stored in the parameter table 602 along with the various sensor and settings data, step 409 . Once stored, the complete parameter table 602 can be compared against the avatar selection logic table 603 , step 410 , to select the appropriate avatar to display, step 411 . Once selected, the processor 391 then transmits the selected avatar file to the requesting computing device, step 415 , which receives the avatar file, step 462 , and displays the selected avatar, step 441 as described above with reference to FIGS. 14-16 .
  • the embodiment shown in FIG. 25 b modifies the embodiment method shown in FIG. 25 a by adding steps 504 - 508 which transmit only an identifier of the selected avatar file to the second user's device.
  • the local memory of the requesting device is checked to determine whether the transmission of the avatar file is necessary or not. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and subsequently transmitted for display.
  • the alternative embodiment shown in FIG. 25 b conserves processing power, battery life and transmission bandwidth and further allows for the use of updated or new avatar files.
  • the embodiment illustrated in FIG. 26 pre-stores the avatar files and avatar selection logic table 603 on computing devices that are authorized to request the user's avatar. While each of the requesting computing devices is previously authorized to request the user's avatar, some computing devices may be authorized to view only certain avatars in response to various sensor and settings data.
  • the embodiment illustrated in FIG. 26 operates in the same manner as described above with reference to FIG. 17 with the addition of checking and sending the authorization level of the second user, steps 501 and 503 , as described above with reference to FIGS. 20 and 21 .
  • the authorization level data is stored in the parameter table 602 along with the various sensor and settings data, step 409 .
  • the complete parameter table 602 can be transmitted to the second user's requesting computing device, step 416 , and compared against the avatar selection logic table 603 to select and display the appropriate avatar as described above with reference to FIG. 17 for steps 462 , 464 , 466 , 441 .
  • artificial intelligence routines may be implemented on the mobile device to prompt users to select an avatar to display when repeated parameter patterns are recognized. For example, if a mobile device continuously polls the GPS sensor 354 and an artificial intelligence application notices that the device is at the same GPS location coordinates between 9 am and 5 pm during weekdays, the processor may prompt the user to identify the name of the location, and suggest a name of “Work”.
  • the processor 391 may prompt the user to identify the user's current activity as “exercise.”
  • the processor 391 may prompt the user to identify the user's current activity as “Running.”
  • if the number of avatar files stored in memory is limited, such as when the avatar files are stored directly on the computing device that will display them, fewer and more generic avatars may be desired. In such instances, fewer parameters may be needed to accurately reflect a user's current status. Conversely, if avatar files are stored on a computing device with greater storage and processing capabilities, such as server 109 , the number of avatar files may be increased, as well as the level of precision in matching an avatar with the user's current status.
  • parameter data may cause an avatar to change consistent with changes in the user's activity.
  • varying avatars may be displayed in response to changes in parameter data. For example, if a user is running, as inferred by the GPS sensor 354 measuring a velocity of about 6 miles per hour and an accelerometer 353 indicating a periodicity of accelerations consistent with running, the avatar selected for display may be an animated image of a runner. As the GPS sensor 354 records increased velocity, the avatar selected for display may show the running image moving faster, and/or show the increased effort by displaying an avatar that is running, sweating and breathing harder. Including such additional avatars, including animated avatars, simply requires adding another line in the avatar selection logic table 601 , 603 linking the increased speed as a criterion for displaying a new avatar file.
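  • For example, extending the selection table with speed bands might look like the following Python sketch, where the thresholds and avatar names are illustrative only.

```python
# Speed bands (mph) mapped to progressively more strenuous runner avatars.
RUNNING_AVATARS = [
    ((0.0, 4.0),  "walking"),
    ((4.0, 7.0),  "running"),
    ((7.0, 99.0), "running_hard"),   # sweating, breathing harder
]

def running_avatar(speed_mph):
    for (lo, hi), name in RUNNING_AVATARS:
        if lo <= speed_mph < hi:
            return name
    return "idle"

print(running_avatar(6.0))   # -> "running"
```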
  • a first user can provide a second user with an accurate representation of the first user's current activity.
  • the displayed avatars could dynamically change as the user changes status.
  • a user may pro-actively change his/her avatar that is displayed to other members of the internet forum.
  • the avatar will only change if the user proactively changes the file selected for display.
  • in contrast, using the various embodiments, other members of the internet forum may observe the user's avatar change automatically as the user changes activity or status. In this way, the user no longer has to actively alter the avatar to be displayed in order to reflect his or her status.
  • similar applications can be implemented in instant messaging, text messaging, or even regular phone call situations.
  • the various embodiments provide a number of new applications for mobile devices. Such applications include improving communications with colleagues, monitoring activities of children, broadening participation in games, and medical monitoring.
  • an avatar can quickly communicate information regarding a user since “a picture is worth a thousand words.” Users may select avatars and program the avatar selection criteria so that colleagues can quickly determine their status prior to sending an e-mail or making a telephone call. For example, if a colleague has access to a user's avatar, a quick view of the avatar prior to sending an e-mail or placing a telephone call will inform the colleague whether the user is involved in some activity that will preclude a prompt reply, such as out for a run, in a meeting, or on vacation.
  • Access links to avatars may be incorporated into address books so that if an individual has been given access rights to a user's avatar the individual can check on the user's status prior to or as part of sending an e-mail or placing a telephone call.
  • the user of an avatar can proactively inform selected colleagues of the user's status. For example, by posting an avatar showing the user is in a meeting (or on travel or vacation), those who may want to contact the user will be informed that a call will not be answered and e-mail may not be promptly read.
  • parents may be able to keep track of children when they are out of their sight.
  • children wearing a mobile device configured according to one or more of the embodiments can be tracked by parents accessing a website that displays avatars of their children including their location and present activity.
  • children involved in a game of “hide ‘n’ seek” may be monitored by their parents viewing a map of the play area which includes avatars indicating the location and movement/status of each child.
  • the displayed avatar may be more closely linked to the movements and activities of the user's real world movements and activities.
  • the various embodiments may enable a user to be involved in a game of paintball while spectators watch the paintball match in a virtual world representation.
  • Each participant can be equipped with a mobile device 301 including a suite of motion, position and location sensors, each reporting sensor data in near-real time to a central server 109 .
  • position sensors may be outfitted on users' limbs and coupled to the mobile device by a wireless data link (e.g., Bluetooth) to provide data on the posture and movements of participants, such as the direction in which an individual is facing or aiming a paintball gun.
  • An avatar representing each paintball participant can be generated in the virtual world representation of the match, with each user's avatar changing location and activity (i.e., running, sitting, hiding) based on the mobile device 301 sensor data, which are sampled in real time.
  • the virtual world avatar representations may therefore accurately mimic the movement and activity of the real world users carrying the mobile devices 301.
  • medical sensors on the mobile device 301 or connected to a processor by wireless data links can report their data (e.g., through the mobile device 301 ) to a system that uses such information to select an avatar that reflects the patient's current status.
  • the processor need not be mobile, and instead may be associated with a facility, such as an emergency room or hospital information system.
  • Sensor data associated with patients can be received from a variety of medical sensors coupled to each patient, such as blood pressure, pulse, EKG, and EEG sensors for example.
  • Avatar selection criteria associated with each of the sensors may be used to select an avatar that reflects a patient's medical needs or condition.
  • a processor can select an avatar consisting of the patient's photograph with a red background, and display that avatar on a nursing station.
  • the use of such an avatar can more efficiently communicate critical information than text (e.g., the patient's name and the medical data) presented on the screen.
  • a pacemaker may be configured to transmit information regarding the condition of the device or the patient's heart to a mobile device, such as by means of a Near Field Communications data link, which can relay the data to a server accessible by the patient's doctor. That server can use the patient's pacemaker data to select an appropriate avatar to efficiently communicate the patient's status to the doctor.
  • the hardware used to implement the foregoing embodiments may be processing elements and memory elements configured to execute a set of instructions, wherein the set of instructions are for performing method steps corresponding to the above methods.
  • some steps or methods may be performed by circuitry that is specific to a given function.
  • the steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two.
  • the software module may reside in a processor readable storage medium and/or processor readable memory both of which may be any of RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other tangible form of data storage medium known in the art.
  • the processor readable memory may comprise more than one memory chip, including memory internal to the processor chip, memory in separate memory chips, and combinations of different types of memory such as flash memory and RAM memory.
  • references herein to the memory of a mobile handset are intended to encompass any one or all memory modules within the mobile handset without limitation to a particular configuration, type or packaging.
  • An exemplary storage medium is coupled to a processor in either the mobile handset or the theme server such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integral to the processor.
  • the processor and the storage medium may reside in an ASIC.

Abstract

A cellular or wireless mobile device includes one or more sensors and a processor configured with software to receive data from the one or more sensors, calendar data and device settings, compare the sensor data, calendar data, device settings and an authorization level of a requesting user to avatar selection criteria, and select an avatar based upon the comparison. By correlating sensor data, calendar data and device settings to a user's current status, the avatar selection criteria enable a processor to automatically select an avatar that reflects the user's current status. Others can then be informed of the user's current status by accessing the user's avatar.

Description

    FIELD OF THE INVENTION
  • The present invention relates generally to providing a current indication of a user's status or activity via a computer generated avatar.
  • BACKGROUND
  • In the computing sense, an avatar is a virtual representation of a computer user. The term “avatar” can also refer to the personality connected with a screen name, or handle, of an Internet user. Avatars are often used to represent the real world user in the virtual world of computing. Avatars can be three-dimensional models used in virtual reality applications and computer games. Avatars can also be a two-dimensional icon (picture) used in Internet forums and other online communities, instant messaging, gaming and non-gaming applications. Avatars may be animated or static.
  • The term avatar dates at least as far back as 1985, when it was used as the name for the player character in a series of computer games. Recently, the usage of avatars has spread in popularity and avatars are now often used in Internet forums. Avatars on Internet forums serve the purpose of representing users and their actions, personalizing their contributions to the forum, and may represent different parts of their persona, beliefs, interests or social status in the forum.
  • The traditional avatar system used on most Internet forums is a small (96×96 to 100×100 pixels, for example) square-shaped area close to the user's forum post, where the avatar is placed. Some forums allow the user to upload an avatar image that may have been designed by the user or acquired from elsewhere. Other forums allow the user to select an avatar from a preset list or use an auto-discovery algorithm to extract one from the user's homepage.
  • In the instant messaging (IM) context, avatars, sometimes referred to as buddy icons, are usually small images. For example, IM icons are 48×48 pixels, although many icons can be found online that typically measure anywhere from 50×50 pixels to 100×100 pixels in size. A wide variety of these imaged avatars can be found on web sites and popular eGroups such as Yahoo! Groups. The latest use of avatars in instant messaging is dominated by dynamic avatars. The user chooses an avatar to represent him while chatting and, through the use of text-to-speech technology, the avatar speaks the text typed into the chat window. Another use for this kind of avatar is in video chats/calls. Some services, such as Skype (through some external plug-ins), allow users to use talking avatars during video calls, replacing the image from the user's camera with an animated, talking avatar.
  • SUMMARY
  • Various embodiment systems and methods are disclosed which automatically update a user's virtual world avatar to provide a more accurate representation of the user's current real world status or activity. Embodiments may receive information from a variety of sensors located either within the user's mobile device or within close proximity to the mobile device to provide some parameters of the user's real world environment. The variety of sensors may include, but is not limited to, a location sensor (e.g., GPS coordinates), a microphone for sensing ambient noise, a camera or light sensor for sensing ambient light, accelerometers, a temperature sensor, and bio-physiological sensors such as a breathalyzer, heart rate monitor, pulse sensor, EEG, ECG, EKG, and/or blood pressure sensor. In addition, embodiments may utilize a user's calendar data as well as mobile device settings to generate an updated virtual representation, via an avatar, of the user's real world status or activity. Alternative embodiments may age the user's avatar over time so that the avatar grows older and more mature as the user does. Various embodiments automatically update or change the user's avatar as the user goes about his/her daily activities. Other embodiments update or change the user's avatar when a request to view the avatar is made. The user's avatar may be viewed in a singular location, such as a webpage. Alternative embodiments may allow a user's avatar to be downloaded to any requesting party. Still other embodiments may pro-actively inform selected parties of a user's current real world status or activity by sending an avatar.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments of the invention, and, together with the general description given above and the detailed description given below, serve to explain features of the invention.
  • FIG. 1 illustrates exemplary avatars suitable for use with the various embodiments.
  • FIG. 2 is system block diagram of a system suitable for use with the various embodiments.
  • FIG. 3 is a system block diagram of a mobile device suitable for use with the various embodiments.
  • FIG. 4 is a process flow diagram of an embodiment method suitable for implementation on the system.
  • FIG. 5 is a process flow diagram of a specific embodiment method suitable for implementation on a mobile handset.
  • FIG. 6 a is an example parameter data table suitable for storing a variety of sensor data, user calendar data, and mobile device settings indicating the current status of the user.
  • FIG. 6 b is an illustrative avatar selection logic table which indicates an avatar to display based on various parameters.
  • FIG. 6 c is a process flow diagram of an embodiment method for calibrating an avatar selection logic table.
  • FIG. 7 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which conserves battery and processor time.
  • FIG. 8 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which responds to a server request.
  • FIG. 9 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which responds to a second user request.
  • FIG. 10 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server.
  • FIG. 11 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server which conserves battery and processor time.
  • FIG. 12 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server which conserves battery and processor time by responding to a server request.
  • FIG. 13 is a process flow diagram of another embodiment method wherein avatar selection is offloaded to a server which conserves battery and processor time by responding to a second user request.
  • FIG. 14 a is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device.
  • FIG. 14 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar directly on the requesting device.
  • FIG. 15 a is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device which conserves battery and processor time.
  • FIG. 15 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar directly on the requesting device which conserves battery and processor time.
  • FIG. 16 a is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device which conserves battery and processing time by responding to a second user request.
  • FIG. 16 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar directly on the requesting device which conserves battery and processing time by responding to a second user request.
  • FIG. 17 is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device wherein avatar selection is offloaded to the requesting user's device.
  • FIG. 18 is a process flow diagram of an alternative embodiment method suitable for implementation on the system.
  • FIG. 19 a is an example parameter data table suitable for storing a variety of sensor data, user calendar data, mobile device settings and authorization level of a user requesting an avatar.
  • FIG. 19 b is an illustrative avatar selection logic table which indicates an avatar to display based on various parameters including the authorization level of the requesting user.
  • FIG. 19 c is a process flow diagram of an embodiment method for calibrating an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 20 is a process flow diagram of an embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 21 is a process flow diagram of another embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 22 is a process flow diagram of another embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 23 is a process flow diagram of another embodiment method for selecting an avatar for display based upon an avatar selection logic table including the authorization level of the requesting user.
  • FIG. 24 a is a process flow diagram of another embodiment method suitable for displaying an avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 24 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 25 a is a process flow diagram of another embodiment method suitable for displaying an avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 25 b is a process flow diagram of another embodiment method suitable for displaying a new or updated avatar selected based upon sensor and setting data and the authorization level of a second user directly on the requesting device.
  • FIG. 26 is a process flow diagram of another embodiment method suitable for displaying an avatar directly on the requesting device based upon sensor and settings data and the second user's authorization level.
  • DETAILED DESCRIPTION
  • The various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the invention or the claims.
  • As used herein, the term mobile device may refer to any one or all of cellular telephones, personal digital assistants (PDAs), palm-top computers, laptop computers, wireless electronic mail receivers (e.g., the Blackberry® and Treo® devices), multimedia Internet enabled cellular telephones (e.g., the iPhone®), and similar personal electronic devices which include a programmable processor and memory. In a preferred embodiment, the mobile device is a cellular handset that can communicate via a cellular telephone network (e.g., a cellphone). However, cellular telephone communication capability is not necessary in all embodiments. Moreover, wireless data communication may be achieved by the mobile device connecting to a wireless data network (e.g., a WiFi network) instead of a cellular telephone network.
  • As used herein, the term “server” refers to any of a variety of commercially available computer systems configured to operate in a client-server architecture. In particular, the term “server” refers to network servers, particularly Internet accessible servers, which typically include a processor, memory (e.g., hard disk memory), and network interface circuitry configured to connect the server processor to the network, such as the Internet.
  • As used herein, the term “theme” refers to the collection of user-configurable settings that may be implemented on a mobile handset to personalize the mobile handset to the user's preferences. A theme is defined by the files and settings used for any and all of the wallpaper (i.e., images presented on the mobile handset display), ring tones (i.e., audio files triggered by different events), ring settings (e.g., loud, medium, soft and silent, as well as vibrate mode), button tones (i.e., tones played when buttons are pressed), button functions, display icons, and speed dial settings (i.e., the telephone number associated with each button configured for speed dialing). A theme may also include settings for other user-configurable settings like password protections, keypad locks, carrier selections, etc. Composed of such data, a theme can be stored as a mix of files (e.g., image and audio files), as well as configuration data (e.g., telephone numbers associated with particular speed dial buttons).
  • With the advent of modern computing and mobile communications, individuals are able to communicate with one another in a variety of ways and at all times. In the past, if an individual wanted to communicate with another, communication could be done through face-to-face conversations, letters, or the telephone. Today, in addition to these conventional means of communication, individuals may communicate with one another via e-mail, SMS, instant messaging, voice over internet protocol (VoIP) calls, video over internet protocol calls, internet forum chats, and telephone communications via mobile device (handset) calls. With so many different channels of communication, individuals expect to be able to contact others whenever they desire. However, some individuals may desire not to be disturbed. For example, an individual may be conducting an important meeting and does not want his mobile device to ring during the meeting. While he may simply turn off his mobile device (or the ringer), he may also wish to inform any callers of the reason he is temporarily unavailable. With mobile communications so ubiquitous, many users expect to be able to contact their intended call recipient at all times. Thus, when an intended recipient does not answer an email, SMS, phone call, etc., the initiating caller is often left wondering why the intended recipient is not responding.
  • Avatars have gained increasing popularity in use as graphical representations of an individual. An avatar can be text (such as a screen name) or a two or three-dimensional graphical representation (e.g., a photograph, cartoon or machine-generated image). Avatars can be static images or dynamic (animated) images. Examples of some avatars are illustrated in FIG. 1. As shown in FIG. 1, avatars can be images which graphically communicate information about the associated individual, such as professions, hobbies, current activities and moods. Historically, users have used an avatar as a representation of the user or the user's persona in online gaming or internet forum chats. Avatars can efficiently convey information about the user, such as the user's interests, just by nature of the avatar. Conventionally, a user would select an avatar to display a representation of the user participating in an online game, internet forum chat, SMS chat, etc. In this way the selected avatar file may be used to represent the real world user in the virtual world of computing or electronic telecommunications. The various embodiments disclosed herein enable users to generate or post an avatar that more closely represents in a virtual world the real world status of the user.
  • Mobile devices, particularly cellular telephones, are practically ubiquitous and indispensable. Consequently, mobile devices can be ideal platforms for housing sensors that can measure the environment and activities of users. By properly analyzing motion, sound, location and other sensed information obtained from sensors mounted on users' mobile devices, computer systems can infer user activities, information that can be used in the various embodiments to update users' avatars to reflect their real world activities. For example, a user's mobile device may “learn” that the user's current GPS location is a conference room in the office. Accordingly, the mobile device may automatically set itself to vibrate mode and also automatically update the user's avatar to depict “do not disturb.”
  • The various embodiments incorporate or make use of a variety of sensors housed in a user's mobile device, and use the sensor information to update or generate avatars which can be displayed to reflect the user's status, location, mood and/or activity. The various embodiments may employ a variety of sensors and access schedule or calendar information maintained within the mobile device to more accurately reflect the user's status. Such an avatar may be made available for public or private viewing, in order to quickly inform viewers of the user's current status, location, mood and/or activity. Such avatars may be sent to others proactively, such as appended to or included within an SMS or e-mail message, or posted to a server where others can access or download the avatar, such as by accessing an on-line game or website where the avatar is maintained. Avatars may be changed according to users' status on a pre-scheduled basis (e.g., periodic updating), whenever the user's status is requested (e.g., in response to a request for an avatar), or whenever the user's status changes (e.g., when sensors indicate the user's location, mood and/or activity have changed).
  • FIG. 2 is a system block diagram of a computing and telecommunications system suitable for use with various embodiments disclosed herein. The system block diagram illustrates an exemplary cellular network system, but is not intended to encompass all possible configurations. As shown in FIG. 2, a variety of users engaged in a variety of activities may have on their person a variety of mobile devices. For example, a first user may be using a laptop computer 101 in a coffee shop. A second user may be shopping at an airport carrying a PDA 102. A third user may be driving a car with a cell phone 103. A fourth user may be conducting a meeting and carrying a cell phone 104. Each of the mobile devices 101, 102, 103, and 104 may communicate via wireless networks with wireless communication base stations 105. The wireless communication base stations 105 may be a cell phone tower, Bluetooth receiver, WiFi receiver, or any other wireless transceiver. In a preferred embodiment, the devices communicate via cellular telephone and data networks with a cellular base station antenna 105. The wireless communication base stations 105 may be coupled to a router 106 and wireless communication server 107 to connect to the Internet 108. Alternative paths to the Internet 108 may involve more or less communication equipment. For example, some wireless service providers may require additional servers to provide their users with access to the Internet 108. Alternatively, the wireless communication base station may permit a more direct link to the Internet 108.
  • Users of mobile devices may communicate with one another or with any other user connected through the Internet 108. For example, a first user may send an email from his laptop 101 to a user at desktop computer 113, a user of a PDA 114, a user of a laptop computer 115, or other users via their cell phones 116, 117. In such a case the user would send an email from the laptop 101 which would be wirelessly transmitted to the base station 105. The email would be sent via a router 106 to a server 107, across the Internet 108, to a server 110 servicing the intended recipient's computing device, and on to a router 111, where it might be sent via a wired connection to a desktop computer 113 or via a wireless base station 112 to a mobile device 114-117. Similarly, the recipient can reply or initiate communications to the user in the reverse manner.
  • Mobile device users may be unavailable to respond to incoming messages from time to time and may wish to provide some indication of their current status to explain why they are non-responsive. Alternatively, mobile device users may want to inform others as to their current status so that others can know if they are available to communicate. Further, some users may wish to inform their friends and family of their current status and activities as part of their social networking lifestyle. Such notification of a user's status may be accomplished efficiently using an avatar that can be accessed by or presented to selected individuals.
  • Such an avatar may be maintained and displayed, for example, on the user's social networking webpage (e.g., myspace.com, facebook.com, etc) or any other webpage maintained on an Internet accessible server. The avatar, along with the contents of the webpage and data contained therein, may be stored in the memory of a server 109. The server 109 is connected to the Internet 108 and may be accessed by devices with Internet 108 capabilities and proper access rights.
  • In an embodiment, users may access a person's avatar, such as by accessing a webpage to display the current status avatar, prior to communicating (e.g., calling, sending an e-mail or sending an SMS message). In another embodiment, when a user attempts to communicate with a person, the user may be automatically directed to the webpage containing the current status avatar if the person does not respond or if the person has selected a “do not disturb” option. In another embodiment, an avatar file may be automatically and directly sent back to a user's device 113-117 for display. In another embodiment, avatar files may be proactively sent to a pre-approved list of recipients whenever a user's status changes, or on regularly scheduled or predetermined intervals. In another embodiment, a link to a user's webpage including an avatar may be sent to a pre-approved list of recipients whenever a user's status changes, or on regularly scheduled or predetermined intervals. As one of skill in the art would appreciate, server 109 may not be necessary if avatar files are being sent directly to a user's device 113-117. In addition, avatars may be hosted on a variety of servers 107, 110, and need not be limited to a particular server 109.
  • In addition to automatically updating avatars based upon sensed and recorded information, an embodiment enables users to override the automatic updating in order to select a particular avatar regardless of the user's current status. Thus, users may elect to have their avatars reflect their current status at some times, while selecting particular avatars at other times. In alternative embodiments, the user may also set permission or authorization levels for avatars to control who may view a particular avatar, at what times, and during what activities. For example, a user may elect to enable the user's supervisor to view a particular avatar only during work hours. Outside work hours the avatar may be hidden, not transmitted, or reflect a general status, such as busy. In such embodiments, users can set the information and avatars that are public. Such embodiments may be used in updating an avatar on a website or in a general broadcast.
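As an illustration of such permission levels, the sketch below filters an automatically selected avatar by a viewer's authorization before display. The authorization levels, work-hour window and fallback avatar files are hypothetical assumptions, not the patent's scheme.

```python
from datetime import datetime, time

# Hypothetical access-control sketch: each viewer is granted a level, and
# each level maps to what that viewer may see during a given time window.
WORK_HOURS = (time(9, 0), time(17, 0))

viewer_levels = {"supervisor": "work_only", "spouse": "full", "public": "generic"}

def avatar_for_viewer(viewer, current_avatar, now=None):
    """Filter the automatically selected avatar by the viewer's rights."""
    now = now or datetime.now()
    level = viewer_levels.get(viewer, "generic")
    if level == "full":
        return current_avatar
    if level == "work_only":
        start, end = WORK_HOURS
        if start <= now.time() <= end:
            return current_avatar
        return "busy.png"          # generic status outside work hours
    return "generic.png"           # public viewers see a neutral avatar

print(avatar_for_viewer("supervisor", "in_meeting.png",
                        datetime(2009, 5, 27, 14, 0)))  # in_meeting.png
```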
  • Information from sensors within a user's mobile device can be used to determine how an avatar should be updated. Each of the mobile devices 101-104 may include a variety of sensors which can provide information used to determine the respective users' status. Examples of various sensors will be described in more detail below. In addition, the day, time, and mobile device settings may be considered to provide further information regarding the respective users' status. Further, the mobile device's own operating status may provide information on the user's status.
  • For example, if a user regularly visits a coffee shop and uses the shop's WiFi network, this status could be determined based upon (1) the time of day and day of the week (TOD/DOW), particularly if the breaks are regular or scheduled, (2) activation of the WiFi transceiver, perhaps in combination with TOD/DOW information, (3) GPS (or other location sensor) coordinates, perhaps in combination with TOD/DOW information and activation of the WiFi transceiver, or (4) background noise picked up by the device's microphone (e.g., the unique sound of an espresso machine), perhaps in combination with TOD/DOW information, activation of the WiFi transceiver, and GPS coordinate information. Other sensors may also confirm the user's status is consistent with a coffee break, including accelerometers (indicating little if any motion) and temperature (indicating an ambient temperature consistent with an indoor location). If the user skipped his/her coffee break, was home sick or on vacation, an avatar display based solely on TOD/DOW information would inaccurately depict the user's current status. By further referencing the mobile device's GPS sensor, the system can determine if the user is within (or close to) the coffee shop location. Using background noise sensing, the system may confirm a coffee break status by recognizing espresso machine noise. Having made this determination of the user's current status, a system (whether the user's mobile device 101 or a server 109 receiving information from the device 101) can select or generate an avatar that graphically depicts this status. In addition, if the device's microphone detected a noise level that exceeded a predetermined limit, the avatar could be altered to reflect the user's ambient environment. This may suggest to someone viewing the avatar that a non-verbal means of communication (e.g., text message) may be the best form of communication if someone wanted to contact the user.
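The multi-signal inference described above can be pictured as a simple concurrence test, sketched below. The signal names and the three-signal threshold are assumptions chosen only for illustration.

```python
# Sketch of combining weak signals into a status inference, as in the
# coffee-shop example above.
def infer_coffee_break(tod_dow_matches_break, wifi_active, gps_near_shop,
                       espresso_noise_detected, low_motion):
    """Count how many independent signals agree with 'coffee break'.
    Requiring several concurring signals avoids relying on the schedule
    alone, which would mislabel a sick day or vacation."""
    evidence = [tod_dow_matches_break, wifi_active, gps_near_shop,
                espresso_noise_detected, low_motion]
    return sum(evidence) >= 3   # e.g., require at least three concurring signals

print(infer_coffee_break(True, True, True, False, True))     # True
print(infer_coffee_break(True, False, False, False, False))  # False
```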
  • As another example, background noise may be monitored (e.g., using the mobile device's microphone) for music and other sounds that may be used to infer the user's mood. For example, if the background noise includes music with an up-tempo beat, an avatar expressing a happy mood may be selected. As this example illustrates, by increasing the number of sensors used and the variety of information considered, a system can better infer the user's current status.
  • As another example, a user status of flying in an airplane may be determined by comparing TOD/DOW information to the user's Calendar information. If the user is scheduled to be on an airplane flight and there is no contrary information (e.g., an open WiFi connection with the user's mobile device 102), a system (whether the user's mobile device 101 or a server 109 receiving information from the device 101) can select or generate an avatar that graphically depicts this status. If airlines begin allowing mobile devices to communicate during flights, the mobile device 102 may also report sensor information consistent with airline travel to permit confirmation of the status. For example, constantly changing GPS data from the mobile device 102, accelerometer data and/or barometric pressure data from a sensor on the mobile device 102 may all be used to confirm airline travel.
  • As another example, a user driving or riding in a car may be determined based upon TOD/DOW (e.g., the time and day being consistent with rush hour) in combination with GPS location information and accelerometer sensor readings. Constantly changing GPS data and accelerometer data from the mobile device 103 may be used to confirm the user's status as driving.
  • As another example, the status of a user of a cell phone 104 in a business meeting may be inferred from TOD/DOW information compared against the user's Calendar. The TOD/DOW and calendar information may be maintained within the cell phone 104 and/or a server 109. Such a preliminary determination can be confirmed by considering other sensor information, such as GPS location information, and accelerometer and temperature sensor readings, and the cell phone 104 operating settings. For example, if the user has switched his mobile device 104 to vibrate or silent ring, these settings are consistent with the user being in a meeting and help to confirm the user's status. Similarly, if the GPS location is consistent with a conference room, the accelerometer readings show little significant movement and the temperature is consistent with an indoor location, this information can be used to confirm that the user is in a meeting. With the user's status so inferred, an appropriate avatar file can be selected or generated.
  • As another example, a user of a cell phone 104 may choose to display a set avatar while on vacation (e.g., an avatar sleeping in a hammock) instead of displaying current status. In this example, the cell phone 104 may be configured to not relay sensor data or communicate a set avatar, or a server 109 may be configured to ignore sensor data and TOD/DOW information and display a selected avatar.
  • In using a variety of sensors and information sources to infer a user's status, certain sensors or parameters may be given higher priority, particularly with respect to each other. For example, GPS location information may override TOD/DOW plus calendar information, such as when the GPS location is different from the location indicated by the calendar information. Logic rules may be generated to deal with such contradictory information. Additionally, user inputs and settings regarding current status may override all sensor, phone settings, or calendar parameter data.
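One plausible reading of such priority rules is a simple ordered override, sketched below. The ranking shown (user override, then GPS, then calendar, then TOD/DOW) is an assumption for illustration, not a ranking the disclosure mandates.

```python
# Sketch of priority ordering among parameter sources: the first source
# that produces a status wins, so higher-priority data overrides lower.
def resolve_status(user_override, gps_status, calendar_status, tod_dow_status):
    for candidate in (user_override, gps_status, calendar_status, tod_dow_status):
        if candidate is not None:
            return candidate
    return "unknown"

# Calendar says "in_meeting" but GPS puts the user at the gym: GPS wins.
print(resolve_status(None, "at_gym", "in_meeting", "working"))  # at_gym
```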
  • FIG. 3 illustrates a component block diagram of an embodiment of a mobile device suitable for use in the overview system. A typical mobile device 301 may include a microprocessor 391, a memory 392, an antenna 394, a display 393, an alphanumeric keypad 396, a 4-way menu selector rocker switch 397, a speaker 388, a microphone 389, a vocoder 399, a receiver 395 and a transmitter 398 (together forming a cellular network transceiver), and various circuits, busses and electrical interconnections among these components. In addition, the mobile device 301 may include an ambient noise sensor 350 which is connected to the microphone 389 to detect ambient noise levels. The ambient noise sensor 350 may also function as a microphone for speaker phone operations. The mobile device 301 may also include a camera 351 which, in addition to taking pictures, can be used to measure ambient light levels. The mobile device may also contain an ambient temperature sensor 352 and one or more accelerometers 353 for detecting the relative acceleration of the mobile device 301. The mobile device may also include a GPS receiver circuit 354 which is configured to receive signals from GPS satellites to determine the precise global position of the mobile device 301. The mobile device may also include a breathalyzer sensor 355 which is configured to measure a blood alcohol content (BAC) from a user's exhaled breath. Other biometric sensors may also be included, such as, for example, a blood pressure monitor, pulse rate sensor, EEG, ECG, EKG, etc. In embodiments where EEG sensors are included, a user's avatar may, for example, be displayed to be concentrating if the EEG sensor indicates brainwave patterns consistent with concentration levels. Alternatively, the user's avatar may be displayed to indicate the user is in a relaxed mental state if the EEG sensor indicates brainwave patterns consistent with relaxation levels.
  • Each of the mobile device 301 sensors 350-356 is connected to the processor 391, which is in turn connected to an internal memory unit 392. In this manner, the processor 391 may collect parameter data from the various sensors 350-356 and may store the data in memory unit 392 or transmit the data via transmitter 398. It should be noted that while mobile device 301 is depicted in FIG. 3 as a mobile handset or cell phone, the system blocks may be found in any mobile device with wireless communication capability. In this way, other mobile devices such as a laptop computer, PDA or similar devices may be represented.
  • The various sensors illustrated in FIG. 3 can be used to more accurately infer a user's status and activities. For example, if the breathalyzer sensor 355 determines that the user's BAC is above 0.1%, indicating the user may be alcohol-impaired, the user's avatar could be selected or generated to indicate the impairment. As another example, if the accelerometer senses rhythmic motion consistent with jogging, an avatar may be selected or generated to indicate the user is exercising. This inferred status, based on accelerometer sensor information, may be confirmed by comparing it with GPS readings to determine whether the user is moving at a jogging pace or is located at a health facility. Also, if the mobile device 301 includes a blood pressure or pulse rate sensor (not shown), information from such sensors may be checked to determine if sensor values are consistent with exercise. Such sensors may enable distinguishing running or biking from traveling in a car, bus or train, which could produce similar accelerometer and GPS information.
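A minimal sketch of how accelerometer periodicity and GPS speed might be combined to distinguish jogging from vehicle travel appears below. The frequency band, variance threshold and speed range are illustrative assumptions, not calibrated values.

```python
import statistics

# Sketch: classify motion as jogging versus riding in a vehicle by combining
# accelerometer periodicity with GPS speed.
def looks_like_jogging(accel_peak_hz, accel_samples_g, gps_speed_mph):
    periodic_footfalls = 1.5 <= accel_peak_hz <= 3.5   # footfall-like cadence
    vigorous = statistics.pstdev(accel_samples_g) > 0.2  # noticeable shaking
    jogging_pace = 3.0 <= gps_speed_mph <= 8.0
    return periodic_footfalls and vigorous and jogging_pace

# A car at 30 mph is rejected on speed alone, even if road vibration
# produces some acceleration variance.
print(looks_like_jogging(2.4, [0.1, 0.9, 0.2, 1.0, 0.15, 0.95], 5.5))   # True
print(looks_like_jogging(2.4, [0.1, 0.9, 0.2, 1.0, 0.15, 0.95], 30.0))  # False
```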
  • While FIG. 3 shows various sensors 350-356 electrically coupled to the processor 391 of mobile device 301, a wireless transceiver, such as WiFi transceiver 356, and a short range wireless transceiver (SRWTx) 357 may be incorporated to communicate with a variety of external sensors. One of skill in the art would appreciate that the short range wireless communication transceiver 357 may be a Bluetooth protocol transceiver, Zigbee protocol transceiver, or other technology/protocol transceiver.
  • FIG. 4 illustrates basic process steps employed in various embodiments. As a first step, the processor 391 of the mobile device 301 may poll some or all of the sensors (e.g., 350-355) connected to the processor 391, step 401. As discussed above, the sensors may include external sensors that are wirelessly connected to the processor 391 via mid- to long-range wireless transceiver 356 and/or a short range wireless transceiver 357 included within the mobile device 301. The sensors may also be coupled to the processor 391 by various available ports or custom interface circuits. Further, the processor 391 may include multiple processors (i.e., processor 391 may be a multi-processor chip) with some sensors coupled to or interfacing with a processor and other sensors coupled to or interfacing with a second processor within the multiprocessor chip. In polling sensors, the processor 391 may access a data register where sensor data is buffered, or send a signal to the sensor (or sensor interface circuitry) directing it to take a reading and wait to receive sensor data. If the sensor is external to the mobile device 301, the processor 391 may send a data request message to the sensor via a wired or wireless data connection and then wait to receive sensor data in a response message via the same data link. Alternatively, the processor 391 may simply receive data transmitted by each of the sensors (e.g., 350-355) without prompting.
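The polling step might be organized as a loop over an abstract sensor interface, as sketched below. The Sensor class and its read() method are stand-ins for the register reads, wired-port queries, and wireless request/response exchanges described above; they are assumptions, not the patent's interfaces.

```python
import random

class Sensor:
    """Stand-in for a hardware sensor; read() represents a register read,
    a wired-port query, or a wireless request/response exchange."""
    def __init__(self, name, lo, hi):
        self.name, self.lo, self.hi = name, lo, hi
    def read(self):
        return random.uniform(self.lo, self.hi)  # placeholder for real hardware

sensors = [Sensor("gps_speed_mph", 0, 10),
           Sensor("ambient_noise_db", 20, 90),
           Sensor("temperature_f", 50, 100)]

def poll_sensors(sensor_list):
    """Step 401: loop over every attached sensor and collect one reading each."""
    return {s.name: s.read() for s in sensor_list}

print(poll_sensors(sensors))
```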
  • After, while, or before receiving sensor data, the processor 391 may also retrieve calendar data stored in memory 392, step 402. As discussed above, calendar data may be used to infer the activity and location of the user of the mobile device 301. Calendar data may be obtained for the particular time of day and day of week. As part of this step, the processor 391 may also obtain the current time and date that may be stored in a data register.
  • After, while, or before receiving sensor and calendar data, the processor 391 may retrieve various mobile device settings, step 403. Such device settings may include the selected ringer type (e.g., silent or audible), theme settings, normal or roaming communication mode, battery power level, current cellular communications status, such as whether the user is engaged in a phone call or accessing the Internet on a mobile browser, current wireless communications status, such as whether the user is presently connected to a WiFi network, and current local area wireless network status, such as whether the user is presently using a Bluetooth device. Each of these mobile device settings and operating conditions may provide information regarding the user's status. For example, if the user has set the mobile device 301 to display a casual theme (e.g., whimsical wallpaper, musical ringer, etc.) this may indicate that the user is not engaged in business or work activities.
  • In each of the various embodiments, the order of the method steps described herein may be varied from that illustrated in the figures. For example, the step of gathering sensor data may be performed after the step of gathering mobile device setting and calendar data. In this manner, the information provided by the device settings or indicated in the user's calendar may be used to make an initial inference regarding the user's activity which may be confirmed or further refined by selected sensor data. For example, if the device settings indicate that a cellular telephone call is in progress, the mobile device processor 391 may be configured to poll only the GPS sensor, since the user's status (talking on a cell phone) is already established, and only the location remains to be determined. As another example, if the calendar data indicates that the user is in a meeting, the processor 391 may be configured with software to poll only those sensors necessary to confirm that status.
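A settings-first polling strategy of the kind just described could look like the following sketch; the returned sensor lists are illustrative assumptions.

```python
# Sketch: if device state already establishes the activity, poll only the
# sensors still needed, saving battery and processor time.
def sensors_to_poll(call_in_progress, calendar_says_meeting):
    if call_in_progress:
        return ["gps"]                      # activity known; only location needed
    if calendar_says_meeting:
        return ["gps", "ambient_noise"]     # just confirm the meeting
    return ["gps", "ambient_noise", "accelerometer", "temperature"]

print(sensors_to_poll(True, False))   # ['gps']
```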
  • Using information gathered from mobile device sensors and stored data and settings, a processor can infer a user's current status and determine which one of a group of avatars to display, step 404. A variety of processors may be used to make this determination in the various embodiments. In one embodiment, the processor 391 of the mobile device 301 is configured with software instructions to select an avatar for display based upon criteria stored in its memory 392. In such an embodiment, the avatar selected for display by the mobile device 301 may be uploaded to another computing device 113-117. For example, in an embodiment the selected avatar may be sent to another mobile computing device as an attachment to an e-mail or SMS message. In another embodiment, the mobile device can send a URL or other network address to another computing device to enable the receiving computer to obtain the avatar by accessing the provided URL or other address. In another embodiment, the mobile computing device 301 may transmit a memory pointer indicating the memory storage location of the selected avatar file to another computing device so that the receiving computing device can obtain the avatar from its own memory. This embodiment may be employed where the avatar file is stored in memory (e.g., hard disc memory) of a server 109, thereby enabling the server to load the selected avatar to the user's webpage. This embodiment may also be employed with other computing devices 113-117.
  • In other embodiments, the mobile computing device 301 may transmit the sensor, calendar, and settings data to another computing device, such as server 109 or other computing devices 113-117, so that the avatar determination, step 404, can be performed by the processor of the receiving computing device. In such embodiments, the sensor, calendar, and settings data are received and stored in memory before the computing device's processor makes the avatar determination.
  • Once the proper avatar to display has been determined, the avatar file can be made available for display on a computing device 113-117, step 405. In an embodiment, the computing device 113-117 displays the selected avatar by accessing the avatar file that was either pre-stored in memory (e.g., using an address or memory pointer communicated by the mobile device 301) or downloaded from either the mobile device 101-104 or a server 109. In another embodiment, the avatar may be accessed and displayed as part of an Internet webpage hosted by a server 109. In alternative embodiments, the avatar files may be updated annually or after some other period of time such that the avatar reflects the age of the user. As the user matures and grows older, the various avatar files may be updated to display an older and more mature avatar. For example, the avatar files may depict the user with graying hair or weight loss or gain, as appropriate to the actual appearance of the user. In this manner, when the appropriate avatar file is accessed or retrieved, the avatar will accurately reflect the user's age.
  • Specific process flow steps of the various embodiments will now be described in greater detail with reference to FIGS. 5-26.
  • FIG. 5 is a process flow diagram of a first embodiment method in which the avatar to be displayed is determined by the mobile device 301 and is constantly updated to reflect the user's current status. In this embodiment, the mobile device 301 processor 391 periodically polls each of the sensors (e.g., 350-356) associated with the mobile device 301, step 401. In this step, the processor 391 may perform a loop in which each of the various sensors is polled, the associated sensor data received and stored in an appropriate data record within the mobile device's memory 392. The processor 391 also checks the calendar data stored in the memory 392, step 402. The calendar data may indicate where the user is expected to be and the particular activity the user is expected to be engaged in at the current time. The processor 391 also checks the various settings of the mobile device, step 403, including the current theme. As noted above, the order in which the processor 391 obtains data is not necessarily important and can vary among implementations. The processor 391 stores all of the sensor, calendar, and settings data in a parameter value table, step 409. The parameter value table will be discussed in more detail below with reference to FIG. 6 a.
  • The processor 391 can evaluate the data stored in the parameter data table, step 410, to determine which avatar to display, step 411. A variety of method embodiments may be used to evaluate the stored parameter data and select a particular avatar for display. In a preferred embodiment, the parameters stored in the data table are compared to values stored in a selection table, such as illustrated in FIG. 6 b. As described below in more detail with reference to that figure, a user may program his/her mobile device to select a particular avatar by entering associated selection criteria in such a selection table. In this manner, any user can easily configure the mobile device to his or her preferences and settings. In this embodiment, the processor 391 determines the avatar to display, step 411, by selecting the avatar for which the greatest number of selection criteria are satisfied by the parameters stored in the parameter data table.
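The "greatest number of selection criteria satisfied" rule might be sketched as follows. The parameter names, criteria values and predicate form are assumptions for illustration only, not the schema of the tables in FIGS. 6 a and 6 b.

```python
# Current parameter value table (step 409), hypothetical values.
parameter_values = {"location": "office", "speed_mph": 0.5,
                    "noise_db": 55, "ring_tone": "silent"}

# Hypothetical selection logic: each avatar lists predicates over parameters.
selection_logic = {
    "work":    {"location": lambda v: v == "office",
                "speed_mph": lambda v: v < 2,
                "noise_db": lambda v: v < 40},
    "meeting": {"location": lambda v: v == "office",
                "ring_tone": lambda v: v == "silent",
                "noise_db": lambda v: v > 50},
    "jogging": {"speed_mph": lambda v: 3 <= v <= 8},
}

def select_avatar(params, logic):
    """Steps 410-411: pick the avatar satisfying the most criteria."""
    def score(criteria):
        return sum(1 for key, test in criteria.items()
                   if key in params and test(params[key]))
    return max(logic, key=lambda name: score(logic[name]))

print(select_avatar(parameter_values, selection_logic))  # meeting
```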
  • In another embodiment, the parameter data may be evaluated in a logic tree programmed in software. In this example embodiment, the processor 391 executes a software routine in which the particular steps performed (e.g., a series of “if X, then Y” logic tests) depend upon certain parameter values. While the use of a programmed logic tree routine may operate faster, this embodiment may be more difficult for a user to configure and may allow fewer user selection options.
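For contrast, the same decision hard-coded as a logic tree might look like the sketch below; it runs with few comparisons but, as the text notes, cannot be reconfigured by the user without new software. The thresholds are illustrative assumptions.

```python
# Sketch of a programmed logic tree: a fixed series of "if X, then Y" tests.
def select_avatar_tree(params):
    if params.get("speed_mph", 0) >= 3:
        return "jogging"
    if params.get("ring_tone") == "silent" and params.get("noise_db", 0) > 50:
        return "meeting"
    if params.get("location") == "office":
        return "work"
    return "default"

print(select_avatar_tree({"location": "office", "speed_mph": 0.5,
                          "noise_db": 55, "ring_tone": "silent"}))  # meeting
```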
  • Once the appropriate avatar has been selected, the processor 391 can direct the transmitter 398 to transmit the selected avatar to a server 109 via a wireless or cellular network, and the Internet 108, step 415. In this step, the processor 391 may transmit the selected avatar file, a pointer or address to memory containing the selected avatar file, or an identifier of the selected avatar that the server 109 can use to locate the corresponding avatar file within its memory.
  • Once the selected avatar has been transmitted to the server 109, the processor 391 may periodically repeat the steps of obtaining parameter values so as to continually update the displayed avatar. If the user has generated appropriate avatars and configured the mobile device to appropriately link individual avatars to various parameter values, this embodiment can enable the server 109 to display an avatar that reflects the user's current status. In an embodiment, the processor 391 may optionally pause for a pre-determined amount of time before repeating the steps of obtaining parameter values in order to reduce the demand on the processor 391, step 450.
  • The transmitted data indicating a particular avatar to display is received by the server 109, step 420. The server processor (not shown separately) of the server 109 may include the selected avatar file in a webpage hosted on the server 109 for the user for public display, step 430. When a second user accesses the user's webpage, step 440, the access request is received by the server 109, step 431. In response, the server 109 transmits the webpage with the selected avatar file as an HTML file to the computing device 113-117 of the second user, step 432. The receiving computing device 113-117 then displays the selected avatar to the second user, step 441. Since the mobile device processor 391 is continually updating the avatar selection, this embodiment ensures that whenever a second user accesses the first user's webpage, step 440, an avatar reflecting the first user's current status is displayed, step 441. Since the polling and analysis of sensor and settings data is performed autonomously, the user's avatar presented to others is maintained consistent with the user's current status without input by the user.
  • A variety of data structures may be used with the various embodiments, an example of which is displayed in FIG. 6 a. As illustrated, data from sensors (e.g., 350-356), the user's calendar, and mobile device settings data can be stored in a parameter value table 600. Such data may be stored in the form of absolute values (i.e., the raw sensor information), or in the form of processed information (i.e., interpreted sensor information). This distinction turns on the amount of processing of the information that is done before it is stored in the parameter value table 600. For example, FIG. 6 a illustrates processed GPS sensor information stored in the parameter value table 600 in which the raw geographic fix coordinate values have been interpreted and compared to a user created table of locations that enables the mobile device 301 to recognize that the device is currently located at the “Track” and is moving at a rate of about 4 mph. Similarly, raw data from an ambient noise sensor 350 has been processed and the recognized characteristic of less than 30 dB has been stored in the parameter value table 600. Similarly, ambient light sensor 351 data has been processed and a recognized characteristic of “bright” has been stored in the parameter value table 600. Similarly, accelerometer 353 data has been processed and stored as a range of acceleration values (in g's) and a recognized characteristic of the acceleration (periodic). Raw sensor data may also be stored directly in the parameter value table 600. For example, temperature sensor data of 87 degrees F. is stored in the table. Similarly, the fact that nothing is stored in the user's calendar for the present time is stored in the parameter value table 600 as either no data or a symbol indicating that no data is present in the calendar database. Similarly, the particular wallpaper employed on the mobile device 301 and the ring tone setting (‘loud’) are stored in the parameter value table 600.
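A parameter value table of the kind shown in FIG. 6 a might be represented as the following sketch, mixing processed characteristics with raw readings. The field names and values are assumptions for illustration.

```python
# Hypothetical parameter value table mirroring the FIG. 6a example.
parameter_value_table = {
    "gps_location":   "Track",       # raw fix interpreted via a user-created location table
    "gps_speed_mph":  4.0,
    "ambient_noise":  "< 30 dB",     # processed characteristic, not raw samples
    "ambient_light":  "bright",
    "accelerometer":  {"range_g": (0.2, 1.1), "pattern": "periodic"},
    "temperature_f":  87,            # raw sensor value stored directly
    "calendar_entry": None,          # nothing scheduled for the present time
    "wallpaper":      "beach_theme", # hypothetical setting value
    "ring_tone":      "loud",
}

print(parameter_value_table["gps_location"], parameter_value_table["gps_speed_mph"])
```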
  • The processed data values stored in the parameter value table 600 illustrated in FIG. 6 a are for explanatory purposes only. As one of skill in the art would appreciate, sensor and setting data are more likely to be stored as digital data that can be interpreted by the processor 391. For example, processed data recognized as being consistent with the particular range or value may be stored as a symbol or binary value.
  • With sensor and setting data stored in a parameter value table 600, this information can easily be compared to criteria stored in an avatar selection logic table 601, an illustrative example of which is shown in FIG. 6 b.
  • An avatar selection logic table 601 may be stored in memory of the computing device which determines the appropriate avatar to display. Thus, if the mobile device 301 determines the avatar to be displayed, such as described above with reference to FIG. 5, then the avatar selection logic table 601 will be stored in the memory of the mobile device 301. In contrast, if the avatar selection is made by a server 109, then the avatar selection logic table 601 may be stored on the hard disk memory coupled to the server processor. Similarly, if another computing device 113-117 makes the avatar selection, then the avatar selection logic table 601 would be stored on that device. Additionally, the avatar selection logic table 601 may be stored on more than one computing device. For example, the avatar selection logic table 601 may be stored on the mobile device 301 and on a server 109 that hosts the user's avatar and webpage. Further, different versions of the avatar selection logic table 601 may be stored on different computing devices, enabling a user to vary the avatar selection criteria based upon the particular computer being used to make the selection. Thus, the avatar selection logic table 601 may be stored in the memory associated with each of these devices depending upon which embodiment is implemented. Additionally, the avatar selection logic table 601 may be transmitted from one device to another to enable the receiving computing device to make the avatar selection.
  • Using an avatar selection logic table 601 to perform the avatar selection provides greater user flexibility and control over the process. The various embodiments are intended to provide users with a flexible and accurate means for presenting avatars reflecting their personal preferences and activities. Thus, there is benefit in giving users fine control over the process used to select and display the avatars of their choice. Use of an avatar selection logic table 601 also simplifies the user's setup process when many sensor and setting criteria are employed. A native application may be provided on the mobile device to enable a user to change the avatar selection criteria or otherwise manually control the avatar presented at any given time.
  • Users may customize the avatar selection logic table 601 such that the avatar file selected for display is chosen based upon a variety of parameters. This customization may be accomplished during a user setup process. Also, avatar selection criteria can be selected and the avatar selection logic table 601 populated during a user setup process in which the user makes personal selections in order to customize the user's own avatar behavior. Such a setup process may be accomplished with the aid of an interactive menu application running on a computing device. Such an interactive menu application may include user tools for creating and modifying and storing avatars, as well as menus for programming the avatar selection logic table 601. As part of the process for creating avatars, such a menu application may require the user to assign a name or descriptor of each avatar that can be saved in the avatar selection logic table 601. Once an avatar is created, or after all avatars have been created, the menu application may then prompt the user to enter values or ranges to be used as criteria for selecting each avatar, and store the user's responses in the appropriate fields of the avatar selection logic table 601.
  • For example, FIG. 6 b shows an avatar selection logic table 601 in which a user has defined an avatar entitled "work", stored as data record 610. For most users a "work" avatar will show a graphic representation of the user engaged in the user's occupation. In this example, selection criteria for the work avatar include: a location at the office and low velocity as recorded by a GPS sensor; low ambient noise; "bright" ambient light conditions; zero or low accelerometer readings (e.g., consistent with sitting or walking); and professional wallpaper settings (such as the company logo). The work avatar selection criteria may not include values for some parameters which the user anticipates are not likely to help resolve the user's status for that particular avatar. For example, the user has decided that the ambient temperature, calendar and ring tone values provide no additional value for selecting the work avatar. As a second example, the avatar selection logic table 601 includes a "meeting" avatar stored as data record 611. For this avatar, which is similar to the work avatar, the user has set the ambient noise criterion at greater than 50 dB and added a ring tone="silent" criterion.
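  • By way of illustration only, the following Python sketch shows one possible in-memory representation of data records 610 and 611; the field names, threshold values and (low, high) range convention are assumptions made for illustration and are not part of the original disclosure:

```python
# Hypothetical sketch of avatar selection logic table 601.
# Range criteria are (low, high) tuples; None means "parameter not used".
AVATAR_SELECTION_LOGIC_TABLE = [
    {
        "avatar": "Work",                # data record 610
        "gps_location": "office",        # within the office geofence
        "gps_speed": (0.0, 1.0),         # m/s: low velocity
        "ambient_noise": (0.0, 50.0),    # dB: low ambient noise
        "ambient_light": "bright",
        "accelerometer": (0.0, 0.2),     # g: consistent with sitting/walking
        "wallpaper": "company_logo",     # professional wallpaper
        "ambient_temp": None,            # not used for this avatar
        "calendar": None,
        "ring_tone": None,
    },
    {
        "avatar": "Meeting",             # data record 611
        "gps_location": "office",
        "gps_speed": (0.0, 1.0),
        "ambient_noise": (50.0, 120.0),  # > 50 dB: people talking
        "ambient_light": "bright",
        "accelerometer": (0.0, 0.2),
        "wallpaper": "company_logo",
        "ring_tone": "silent",           # silenced phone suggests a meeting
    },
]
```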
  • Alternatively, the mobile device may be programmed with a software application that enables a user to populate the avatar selection logic table by performing an activity while recording sensor and setting data, and then identifying the avatar to associate with the recorded values. For example, a user who frequently jogs on a particular track may calibrate the avatar selection logic table by activating a calibration process while jogging at the track. In a calibration process, the mobile device 301 can record the sensor values and device settings during the activity, average the values, and store the average or range within the avatar selection logic table 601. In this manner, the mobile device can record the GPS coordinates of the track, the speed range of the user while jogging, the ambient noise, light and temperature conditions, and the accelerometer readings while jogging. In particular, some sensor values may exhibit characteristic patterns during particular activities that may be recognized and recorded in a calibration process. For example, an accelerometer may be able to recognize when a user is jogging based upon periodic accelerations with values and periodicity consistent with foot falls. The mobile device 301 may also record the device settings selected by the user during the activity.
  • FIG. 6 c illustrates an example embodiment calibration method suitable for completing an avatar selection logic table. To begin the process, a user may select a particular avatar to be calibrated, step 610. This avatar may have already been created by the user and given a name. Alternatively, the user may enter a name for an avatar yet to be created. The user then begins the activity and initiates the calibration, such as by pressing a particular key on the mobile handset, step 612. During the activity, the mobile device 301 records the sensor data and device settings, step 614. For some sensors, this may involve recording sensor readings over a period of time along with the time of each recording in order to be able to recognize time-based patterns. After a period of time, the user may end the calibration, such as by pressing a particular key on the mobile device, step 616. Alternatively, the calibration may proceed for a preset amount of time, so that step 616 occurs automatically.
  • Once calibration data gathering is completed, the processor 391 of the mobile device 301 can analyze the recorded sensor data using well-known statistical processes. For example, the sensor data may be statistically analyzed to determine the average sensor value and the standard deviation of sensor values. This calculation may be used to provide a mean with range (i.e., ±) value characterizing the particular activity. Alternatively, the sensor data may be analyzed to determine the maximum and minimum values, thereby determining the actual range of measurements during the activity. This analysis may be particularly appropriate for GPS coordinate values in order to determine the boundaries of the activity (e.g., the perimeter of a jogging track). Sensor data may also be analyzed over time to determine characteristics of the values, such as whether accelerometer readings vary periodically, as may be the case while jogging or walking, or randomly, as may be the case in other activities. More sophisticated analysis of the data may be employed as well, such as processing recorded ambient noise to detect and record particular noise patterns, such as the sound of an espresso machine. Once analyzed, the conclusions drawn from the sensor data and mobile device settings may be stored in the avatar selection logic table 601 in a data record including the avatar name, step 620. The avatar selection logic table 601 may be stored in memory 392 of the mobile device 301. The user may repeat the process of selecting an avatar for calibration, engaging in the associated activity while allowing the mobile device 301 to perform a calibration routine, and storing the analyzed sensor data in the avatar selection logic table 601 until criteria have been saved for all of the user's avatars. Optionally, when the avatar selection logic table 601 is completed, the table may be transmitted to another computing device, such as a server 109 on which the user's avatar is hosted, step 622.
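  • As a minimal sketch of the statistical reduction just described, the following hypothetical Python function converts recorded sensor samples into mean ± one standard deviation acceptance ranges and stores them as a data record; the function name, arguments and record layout are illustrative assumptions, not the disclosed implementation:

```python
import statistics

def calibrate_avatar(avatar_name, samples, settings, logic_table):
    """Reduce recorded sensor samples to (mean - sd, mean + sd) acceptance
    ranges and store them, with the device settings, as a data record in
    the avatar selection logic table (cf. step 620). `samples` maps a
    sensor name to the list of readings captured between the start
    (step 612) and end (step 616) of the calibration run; GPS coordinates
    could instead use min()/max() to capture the activity's boundary."""
    record = {"avatar": avatar_name, **settings}
    for sensor, values in samples.items():
        mean = statistics.fmean(values)
        spread = statistics.stdev(values) if len(values) > 1 else 0.0
        record[sensor] = (mean - spread, mean + spread)
    logic_table.append(record)
    return record
```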
  • Users may repeat the process illustrated in FIG. 6 c at any time to update the calibration or criteria for a particular avatar or to add selection criteria for a newly created avatar.
  • Since users may not know the GPS coordinates of their various activities, and are unlikely to be able to accurately estimate ambient noise and accelerometer characteristics of an activity, this self-calibration method simplifies the process of setting up the avatar selection criteria. Further, users may not recognize how various activities impact their mobile device, and thus this embodiment method enables avatars to be selected based on as many sensors and setting values as can be recorded, even those of which the user may not be aware. For example, a coffee shop may have a number of characteristic background noises, such as the sound of an espresso machine, which the user may not notice.
  • The avatar selection logic table 601 shown in FIG. 6 b is intended for illustrative purposes only. More or fewer selection criteria may be included in the table depending upon the sensors and settings included in the mobile device 301. The avatar selection logic table may also include fewer parameters if the user decides that the avatar to display can be determined using fewer parameters. The specific parameters associated with each avatar in the avatar selection logic table are merely illustrative and will be altered according to each individual user's preferences.
  • To select a particular avatar based upon the values stored in the parameter value table 600, a processor (be it in the mobile device 301, a server 109, or another computing device) can compare each value to corresponding criteria in the avatar selection logic table 601. A variety of algorithms may be employed to determine which avatar's selection criteria are most closely satisfied by the values in the parameter value table 600. In some implementations, a simple sum of the number of satisfied criteria may be sufficient to determine the appropriate avatar to assign. In an embodiment, weighting factors may be applied to selected criteria so that some measured sensor values are given greater weight when selecting an avatar. In another embodiment, one or two criteria may be used to make a preliminary determination of the current status, followed by a comparison of parameter values against confirmatory criteria. For example, GPS and calendar data may be used as primary indicators of particular activities, with noise, light and accelerometer data used to confirm the activity indicated by GPS location or calendar entries. For example, the GPS values stored in the parameter value table 600 may most closely match the criteria for the running avatar, data record 618. The running avatar can then be confirmed as an appropriate selection by comparing the accelerometer data with the corresponding criteria in the avatar selection logic table 601. In this example, the accelerometer data distinguishes the activity from driving by or walking near the track. By making such comparisons of the values stored in the parameter value table 600 to the criteria in the avatar selection logic table 601, a processor can determine that the "Running" avatar should be displayed. In another embodiment, the mobile device 301 may be configured with software instructions to ask the user whether it has correctly diagnosed the current activity (such as running), or to ask the user to name the current activity. This method can enable the mobile device 301 to "learn" when parameters meet the desired criteria. Alternatively, in embodiments where avatar files and avatar selection logic tables are hosted on a remote server, the avatar selection logic tables of previous users may be used to populate a new avatar selection logic table for a new user. For example, if a number of previous users have assigned a "jogging" avatar to a set of sensor data which includes a particular GPS location (track), accelerometer readings, noise, light, etc., then an artificial intelligence routine running on the server may recommend the "jogging" avatar to the new user when the same or a similar set of sensor data is generated during a calibration routine. The artificial intelligence routine may analyze each of the hosted avatar selection logic tables to identify patterns or commonalities of assigned avatars and corresponding sensor data. By identifying these patterns, the server may recommend an avatar based upon the sensor data gathered during a calibration process.
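  • A minimal sketch of the "sum of satisfied criteria" approach with optional weighting factors is shown below, assuming the table representation sketched after FIG. 6 b; `criterion_satisfied` is a simplified stand-in for whatever per-parameter comparison (geofence test, pattern match, etc.) an implementation actually uses:

```python
def criterion_satisfied(value, criterion):
    """Simplified comparison: (low, high) tuples are inclusive ranges;
    any other criterion must match exactly."""
    if value is None:
        return False
    if isinstance(criterion, tuple):
        low, high = criterion
        return low <= value <= high
    return value == criterion

def select_avatar(parameter_values, logic_table, weights=None):
    """Score each data record by the (optionally weighted) number of
    satisfied criteria and return the best-matching avatar name."""
    weights = weights or {}
    best_avatar, best_score = None, float("-inf")
    for record in logic_table:
        score = sum(
            weights.get(param, 1.0)
            for param, criterion in record.items()
            if param != "avatar" and criterion is not None
            and criterion_satisfied(parameter_values.get(param), criterion)
        )
        if score > best_score:
            best_avatar, best_score = record["avatar"], score
    return best_avatar
```

  • With the table sketched earlier, a call such as select_avatar({"gps_location": "office", "gps_speed": 0.4, "ring_tone": "silent"}, AVATAR_SELECTION_LOGIC_TABLE) would score the "Meeting" record highest, because the ring tone criterion is satisfied only by that record.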
  • For many activities, the GPS sensor location and velocity data will provide a good indication of the avatar to display. However, in many situations multiple avatars may be associated with the same GPS location. For example, data records 610 and 611 in the avatar selection logic table 601 both include GPS location criteria of the user's office. This ambiguity may be resolved by considering the ambient noise level, which if it exceeds 50 dB would indicate a meeting, or by considering the ring tone setting, which if it is set to "silent" would also indicate a meeting, indicating that the "Meeting" avatar should be displayed instead of the "Work" avatar. Alternative embodiments may ask the user (such as by way of a prompt presented on the display) the nature of an activity in order to learn and associate the recorded sensor and setting data with the specific activity. Instead of asking the user to define the nature of the activity, the mobile device 301 may ask the user to identify a particular avatar to be associated with the present activity in the future. In this manner, the displayed avatar may more accurately represent the user's status.
  • FIG. 7 is a process flow diagram of an alternative embodiment which conserves processing power of the mobile device by including a decision step 405 which determines if any of the parameters stored in the parameter value table 600 has changed. If the sensor values and device settings are the same as those already stored in the parameter value table 600, in other words if no parameter value has changed, then no change in the avatar is required. Accordingly, the steps of selecting an avatar for display and transmitting the selection to a server (steps 410-415) do not need to be performed. Thus, if no parameter has changed in step 405, the processor can return to the process of polling sensors and checking settings, steps 401-409. Optionally, steps 401-409 may be repeated after some predetermined delay, step 450, to reduce the amount of processing time dedicated to monitoring sensors. When the processor determines that a value in the parameter value table 600 has changed, the method can continue on to select a particular avatar for display and transmit the avatar to a server, steps 410-441, in a manner similar to that described above with reference to FIG. 5.
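  • One possible outer loop for this embodiment is sketched below; the function names `poll_once` and `select_and_send` are hypothetical callbacks standing in for the polling steps and the selection/transmission steps, and the default delay is an arbitrary illustrative value:

```python
import time

def monitor_loop(poll_once, select_and_send, delay_s=30.0):
    """Re-poll sensors and settings (steps 401-409), but run avatar
    selection and transmission (steps 410-415) only when something
    actually changed (decision step 405). `poll_once` returns the current
    parameter value table as a dict."""
    previous = None
    while True:
        current = poll_once()                # steps 401-409
        if current != previous:              # decision step 405
            select_and_send(current)         # steps 410-415
            previous = current
        time.sleep(delay_s)                  # optional pause, step 450
```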
  • FIG. 8 is a process flow diagram of an embodiment which conserves processing power of the mobile device 301 by only polling sensors, checking device settings and selecting an avatar for display, steps 401-415, when prompted by the server 109 hosting the user's webpage. In this embodiment, the server 109 periodically transmits a request to the mobile device 301 requesting it to transmit the current avatar, step 455. The process of requesting an update from the mobile device 301 is referred to in the figures as a "ping" of the mobile device 301. The server 109 may send a request for the avatar to the mobile device 301 periodically or at pre-determined times in order to maintain on the user's website an avatar reflecting the current status of the user. In response to receiving a request for an avatar or a "ping" from the server 109, the processor 391 of the mobile device 301 conducts the process of polling sensors, step 401, checking calendar data, step 402, checking device settings, step 403, storing the data in a parameter value table, step 409, selecting an avatar for display by comparing the parameter values to avatar selection criteria, steps 410, 412, and transmitting the selected avatar to the server 109, step 415, all in a manner similar to that described above with reference to FIG. 5.
  • Depending upon the periodicity of "pings" from the server 109, step 455, the user may complete and change activities between avatar updates. As a result, the displayed avatar may not always accurately represent the user's current status. The currency of the displayed avatar can be enhanced by increasing the frequency of avatar update requests sent to the mobile device, step 455, but at the cost of additional processing overhead. Similarly, increasing the period between avatar update requests sent to the mobile device, step 455, reduces the processing overhead of the mobile device 301. Thus, by varying the periodicity of avatar update requests sent to the mobile device, step 455, a trade-off can be managed between avatar currency and mobile device processor overhead.
  • FIG. 9 is a process flow diagram of another embodiment which conserves processing power of the mobile device 301 by only polling sensors, checking device settings and selecting an avatar for display, steps 401-415, when the server 109 receives a request for the user's avatar. When someone would like to view the user's avatar, they may send a request to the server 109 to access the user's webpage containing the avatar, step 440. When the server 109 receives the request for the user's avatar, step 431, the server 109 sends an avatar update request to the mobile device, step 455. In response to receiving a request for an avatar or a "ping" from the server 109, the processor 391 of the mobile device 301 conducts the process of polling sensors, step 401, checking calendar data, step 402, checking device settings, step 403, storing the data in a parameter value table, step 409, selecting an avatar for display by comparing the parameter values to avatar selection criteria, steps 410, 412, and transmitting the selected avatar to the server 109, step 415, all in a manner similar to that described above with reference to FIG. 5. Upon receiving the selected avatar file from the mobile device, step 420, the server 109 can insert the avatar into the user's webpage, step 430, and transmit the webpage including the avatar to the requester, step 432, where it can be displayed by the requester's browser, step 441. In this manner, the mobile device 301 only has to poll the sensors and retrieve calendar and device settings data when a second user wants to view the user's avatar. This minimizes the processing resources of the mobile device 301 dedicated to providing current status avatars. While the embodiment illustrated in FIG. 9 may result in a slight delay before the requester can view the avatar, the resulting avatar will accurately reflect the user's current status.
  • FIG. 10 is a process flow diagram of an embodiment in which the selection of the avatar is made by the server 109 which hosts the user's webpage. In this embodiment, the processes of polling sensors, checking calendar data and recording device settings in a parameter value table, steps 401-409, are substantially the same as those described above with reference to FIG. 5. Instead of comparing the parameter values to avatar selection criteria within the mobile device 301, the mobile device transmits the parameter value table to the server 109, step 416. As discussed above, the data saved in the parameter value table 600 and transmitted to the server 109 in step 416 may be processed data or raw sensor data, depending upon the implementation. Once the parameter value table 600 is transmitted to the server 109, step 416, the processor 391 of the mobile device 301 may repeat the process of polling sensors, etc., steps 401-409, so that sensor and setting data reflecting the user's current status is transmitted periodically to the server 109. Optionally, the mobile device processor 391 may pause, step 450, before repeating the polling and recording steps 401-409.
  • The parameter value table 600 is received by the server 109, step 417, where the table may be stored in hard disk memory. In an alternative embodiment, the mobile device 301 may transmit sensor parameter values sequentially to the server 109, which can then populate the parameter value table 600 as the data is received in step 417. Once the parameter value table 600 has been received, the values may be compared to the avatar selection criteria in the avatar selection logic table 601, step 418. The process of comparing parameter values to the avatar selection criteria may be accomplished in the server 109 in a manner substantially similar to that described above with reference to step 411 and FIG. 5. Based upon the comparison in step 418, the server 109 selects an appropriate avatar for display, step 419, and prepares the avatar for display, such as by inserting the avatar into the user's webpage hosted by the server 109. By keeping the user's parameter value table 600 up-to-date, an avatar reflecting the user's current status will be displayed when, in response to someone accessing the avatar, step 440, the server 109 transmits the avatar, step 432, for display on the requester's computing device, step 441. This embodiment obviates the need for the mobile device to select the appropriate avatar and transmit the avatar file to the server, thereby reducing processor overhead and saving power. Because most mobile devices 301 are limited in their processing power and battery capacity, offloading the avatar selection to the server 109 processor can enable the mobile device 301 to operate longer on a single battery charge.
  • FIG. 11 illustrates an alternative embodiment in which the avatar selection is performed by a server 109, but the mobile device 301 only transmits sensor, calendar and setting data if such data has changed. The processing of this embodiment is substantially the same as described above with reference to FIG. 10 with the addition of a test, step 405, to determine if any parameter values have changed since the last transmission of the parameter value table, step 416. If no parameter values have changed, then the processor 391 of the mobile device 301 repeats the process of polling sensors etc., steps 401-409, after an optional pause, step 450. Only if a parameter value has changed does the mobile device transmit the updated parameter value table to the server 109, step 416. This embodiment has the added advantage of transmitting the parameter value table only when an update to the values stored on the server 109 is required. Thus, this embodiment further saves on mobile device battery use.
  • FIG. 12 illustrates an alternative embodiment in which the avatar selection is conducted by the server 109 periodically requesting updates from the mobile device 301. In a manner similar to that described above with reference to FIG. 8, the server 109 periodically requests an update of the parameter value table 600, step 455. In response, the mobile device 301 polls sensors, etc., and transmits the parameter value table 600 to the server 109 as described above with reference to steps 401-416 in FIG. 10. The rest of the steps in this embodiment are substantially the same as described above with reference to FIG. 10. This embodiment further reduces mobile device 301 battery consumption by limiting the polling of sensors, etc., to a periodicity that is controlled by the server 109. As explained above with reference to FIG. 8, the trade-off between avatar currency and battery consumption can be controlled by varying the periodicity of requests made to the mobile device 301. Alternative embodiments may also employ sensors which "sleep" and awaken only when some parameter (e.g., noise, light, acceleration, change of location, etc.) is detected.
  • FIG. 13 illustrates another alternative embodiment in which the avatar selection is conducted by the server 109 in response to requests from others to view the avatar. In a manner similar to that described above with reference to FIG. 9, when someone sends a request to access the user's avatar, step 440, the server 109 receives the request, step 431, and in response sends an avatar update request to the mobile device, step 455. In response, the mobile device 301 polls sensors, etc., and transmits the parameter value table 600 to the server 109 as described above with reference to steps 401-416 in FIGS. 10 and 12. The rest of the steps in this embodiment are substantially the same as described above with reference to like numbered steps in FIG. 10. This embodiment conserves still more battery power by limiting the polling of sensors and the transmission of data to occasions when someone accesses the user's avatar.
  • The foregoing embodiments were described with reference to a server 109 which serves as a host or access point for a user's avatar. These embodiments make use of the current Internet architecture in which avatars are maintained on servers to provide their accessibility. However, alternative embodiments may permit the display of a user's avatar on a second user's computing device (e.g., 113-117) without the need to access a server 109. In these embodiments, avatar files are stored on the first user's mobile device 301, on the second user's device 113-117, or on both. Because the storage and display of avatar files may require significant memory storage space and processor time (particularly if the avatar file is three-dimensional or animated), pre-authorization to request and receive avatar files between the users may be desired. An illustrative embodiment of such a method to display a user's avatar is shown in FIG. 14 a.
  • Referring to FIG. 14 a, the processing within the mobile device 301 is substantially similar to that described above with reference to FIG. 5 up to the point where the avatar is selected, step 411. The mobile device 301 may continuously poll the sensors, step 401, and check calendar and settings data, steps 402, 403, to collect data regarding the user's current status. The gathered data can be stored in a parameter value table 600, step 409, and compared against an avatar selection logic table 601, step 410. Based upon the comparison, an avatar file to display is selected, step 411. In this manner, the avatar selection is continuously updated so that the selected avatar reflects the user's current status. In order to conserve battery power and reduce processor overhead, an optional pause or delay, step 450, may be taken between polling cycles.
  • The foregoing process steps ensure that the mobile device 301 has a current avatar selection stored in memory. A request for the avatar may be sent by a second user, step 460, by e-mail, SMS message and/or a phone call. In response to receiving a request for the avatar, step 461, the mobile device 301 processor 391 recalls the avatar file from memory and transmits the file to the requester, step 462. Upon receiving the avatar file, step 463, the requesting computing device can then display the avatar, step 441.
  • In an embodiment the processor 391 may compare the source (e.g., return address or telephone number) of the contact request against a list of pre-approved second user devices (e.g., 113-117) to determine if the requesting device is authorized to receive the user's avatar. This list of preapproved users can be similar to a list of friends and family telephone numbers and e-mail addresses.
  • In an embodiment, a plurality of the user's avatar files can be pre-stored on the device initiating contact, such as on all computing devices that are preapproved to receive the user's avatar. Unlike the foregoing embodiment where the entire avatar file is transmitted to the requester, only the avatar name needs to be transmitted in order for any requester to recall the avatar file from its own memory. By pre-storing avatar files on pre-approved computing devices, the delay between the avatar request and its presentation to the requester is minimized since only the avatar name is transmitted. However, such embodiments may impose significant storage requirements on multiple devices, as well as the time required to download all of the user's avatar files to each preapproved computing device.
  • In an alternative embodiment, the avatar files may be pre-stored on the pre-approved computing device, with new avatar files directly transmitted to the pre-approved computing device as needed. In order to minimize the power consumption of both the mobile device 301 and the requesting device, only an identifier of the selected avatar file is transmitted to the second user's device. The requesting device checks its local memory to determine whether transmission of the avatar file is necessary. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and the file is subsequently transmitted for display. FIG. 14 b illustrates an alternative embodiment which is substantially the same as the embodiment described above with reference to FIG. 14 a with the exception that an avatar file identifier (ID) is transmitted to the requesting device in step 504. The requesting device receives the avatar ID, step 506, and uses the ID to determine if the associated avatar file is already stored in its memory, test 507. If the avatar file corresponding to the ID already exists in the local memory (i.e., test 507=Yes), the avatar is promptly displayed, step 441. However, if the avatar file corresponding to the ID is not already in memory (i.e., test 507=No), the requesting device requests that the file be transmitted, step 508. The request is received by the mobile device 301, step 505, and the corresponding avatar file is transmitted, step 462. Thereafter the requesting device receives the avatar file, step 463, and displays it, step 441. As an optional step, not shown, the received avatar file may be stored in local memory for future use. In this manner the alternative embodiment allows for the display of new or updated avatar files. In addition, power consumption is reduced on both the mobile device 301 and the requesting device because a large avatar file is transmitted only when the requesting device does not already have the avatar file in local memory. By limiting the amount of data transmitted, the alternative embodiment also conserves transmission bandwidth.
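  • The requester-side cache check might be realized as in the following Python sketch; the cache directory, file naming scheme, and the `request_full_file` and `display` callbacks are hypothetical stand-ins for the transport and rendering layers:

```python
import os

AVATAR_CACHE_DIR = "avatar_cache"  # hypothetical local store of avatar files

def display_avatar_by_id(avatar_id, request_full_file, display):
    """Given only the avatar ID (step 506), display the cached file if
    present (test 507); otherwise request the full file (steps 508, 505,
    462, 463), optionally cache it, and display it (step 441)."""
    path = os.path.join(AVATAR_CACHE_DIR, f"{avatar_id}.avatar")
    if not os.path.exists(path):             # test 507 = No
        data = request_full_file(avatar_id)  # steps 508, 505, 462, 463
        os.makedirs(AVATAR_CACHE_DIR, exist_ok=True)
        with open(path, "wb") as f:          # optional caching step
            f.write(data)
    display(path)                            # step 441
```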
  • FIG. 15 a illustrates an alternative to the foregoing embodiment shown in FIG. 14 a, which includes a test, step 405, to determine if any parameter has changed. As described more fully above with reference to step 405 in FIG. 7, if no parameter has changed since the last avatar selection, step 411, then the processor 391 may continue to collect sensor, calendar and settings data, steps 401-409. If a parameter has changed, then the method may continue as described above with reference to like numbered steps in FIG. 14 a.
  • Similar to the manner in which the embodiment illustrated in FIG. 14 b modified the method of FIG. 14 a, the embodiment shown in FIG. 15 b modifies the method shown in FIG. 15 a. The alternative embodiment shown in FIG. 15 b conserves processing power, battery life and transmission bandwidth and further allows for the use of updated or new avatar files. Similar to the embodiment shown in FIG. 14 b, the embodiment shown in FIG. 15 b modifies the method shown in FIG. 15 a by further including steps 504-508 which transmit only an identifier of the selected avatar file to the second user's device. The local memory of the requesting device is checked to determine whether the transmission of the avatar file is necessary or not. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and subsequently transmitted for display.
  • FIG. 16 a illustrates an alternative embodiment which further conserves processor and battery time of the mobile device 301 by responding to an avatar request made by a second user. An avatar request may be transmitted to the mobile device 301 by a second user, step 460. The avatar request is received by the mobile device 301, step 461, and in response the processor 391 gathers and stores sensor, calendar and settings data, steps 401-409, as described above with reference to FIG. 5. The processor 391 then compares the gathered data in a parameter value table 600 to an avatar selection logic table 601, step 410, to select an avatar for display, step 411. The processor 391 then transmits the selected avatar file to the requesting computing device, step 415, which receives the avatar file, step 462, and displays the selected avatar, step 441.
  • In an embodiment the processor 391 may compare the source (e.g., return address or telephone number) of the contact request against a list of pre-approved second user devices (e.g., 113-117) to determine if the requesting device is authorized to receive the user's avatar. This list of preapproved users can be similar to a list of friends and family telephone numbers and e-mail addresses.
  • In an embodiment, a plurality of the user's avatar files can be pre-stored on the computing device requesting an avatar. For example, the user's avatar files may be stored on all computing devices that are preapproved to receive the user's avatar. Unlike the foregoing embodiment where the entire avatar file is transmitted to the requester, only the avatar name needs to be transmitted in order for any requester to recall the avatar file from its own memory. By pre-storing avatar files on pre-approved computing devices, the delay between the avatar request and its presentation to the requester is minimized since only the avatar name is transmitted. However, such embodiments may impose significant storage requirements on multiple devices, as well as the time required to download all of the user's avatar files to each preapproved computing device.
  • Alternatively, similar to the embodiments shown in FIGS. 14 b and 15 b, the embodiment shown in FIG. 16 b modifies the embodiment method shown in FIG. 16 a by adding steps 504-508, which transmit only an identifier of the selected avatar file to the second user's device. The local memory of the requesting device is checked to determine whether transmission of the avatar file is necessary. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and the file is subsequently transmitted for display.
  • In an embodiment illustrated in FIG. 17, the avatar files and avatar selection logic table 601 are pre-stored on computing devices that are authorized to request the user's avatar. In this embodiment, a preapproved computing device requests an avatar, step 460, such as by calling, sending an e-mail, or sending an SMS message to the user. In response to receiving the request for an avatar, step 461, the processor 391 of the mobile device 301 polls sensors, checks calendar data and checks device settings, steps 401-403, and stores the data in a parameter value table 600, step 409, as described more fully above with reference to FIG. 5. The processor 391 then transmits the parameter value table 600 to the requesting computing device, step 416. The requesting computing device receives the parameter value table, step 462, and compares the values to an avatar selection logic table 601 stored on the computing device, step 464, in order to select the appropriate avatar for display, step 466. Having selected the appropriate avatar, the computing device then recalls the avatar from its memory and displays it, step 441. The process by which the avatar-requesting computing device compares parameter values to avatar selection criteria, steps 464 and 466, is substantially the same as the similar steps described above as performed by the mobile device 301 or server 109.
  • In alternative embodiments, the user may also set authorization selection criteria to control who may view each avatar, at what times, and during what activities. A user may provide a plurality of avatars corresponding to the same sensor settings but differing depending upon the identity of a requester (i.e., the second user making the request). Some avatars may provide less detail regarding the user's exact activity or may differ significantly from the activity in which the user is actually engaged. Further, a user may set authorization controls to override sensor information to ensure the displayed avatar does not correspond to the user's actual activity.
  • For example, a user may set authorization levels to ensure that the user's boss may only view an avatar detailing the user's activity during work hours. At other times the avatar may be denied or hidden, or if an avatar is present, it may be a simple avatar indicating that the user is busy without depicting the activity in which the user is engaged. In another example, a user may set authorization levels (also referred to as permission controls) so that the user's boss may view an avatar depicting the user at work, when the sensor data indicates that the user is at the gym.
  • FIG. 18 illustrates example process steps that may be employed in embodiments which select an avatar based in part upon an authorization level of a requestor. As described above with reference to FIG. 4, the mobile device processor 391 may retrieve various pieces of data regarding the user's current activity, steps 401-403. Once relevant data regarding the user's current activity is collected, the authorization level of the requestor is determined, step 501. Various methods to determine the requestor's authorization level are described in more detail below. Once the authorization level of the requestor is determined it may be used as another parameter to select the appropriate avatar to display. In embodiments where the avatar is sent directly from the mobile device 301 to the requestor's device, the mobile device's processor 391 may check the authorization level of the requestor. In instances where the avatar is stored and inserted into a webpage at a central server location, the processor of the server or the processor of the mobile device may perform the step of checking the authorization level of the requestor.
  • Any of a variety of methods may be implemented to check a requestor's authorization level. For example, a requestor may be asked to provide some form of authentication credential, such as a user name and password, to verify that the requestor is authorized to receive an avatar and/or has a particular authorization level. Alternatively, some specific computing devices may be authorized to view selected avatars. The processor may check a static internet protocol (IP) address of the computing device submitting a request for the avatar to determine if the computing device is authorized to receive the avatar, such as by comparing the static IP address received in the avatar request message to a list of static IP addresses authorized to receive the avatar. Any method which authenticates a requestor or a computing device transmitting a request to receive an avatar as an authorized user/device, or which categorizes a requestor or device into various authorization levels, may be used in the various methods to perform the step of checking the authorization level of a requestor. Once the determination is made of whether the requestor is authorized to receive a particular avatar (or the level of the requestor's authorization), this criterion may be entered into the parameter table as another criterion used to determine which avatar to display or transmit to the requestor.
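  • One possible realization of such an authorization check is sketched below; the whitelist contents, level names and `verify` callback are illustrative assumptions rather than a disclosed implementation:

```python
AUTHORIZED_STATIC_IPS = {        # hypothetical whitelist kept by the user
    "198.51.100.17": "family",
    "203.0.113.5": "buddy",
}

def check_authorization_level(request_ip, credentials=None, verify=None):
    """One way to perform step 501: devices on the static-IP whitelist
    receive their configured level; otherwise fall back to a
    username/password check via the `verify` callback; anyone else is
    treated as unauthorized (which may map to a generic avatar or to
    "no avatar")."""
    level = AUTHORIZED_STATIC_IPS.get(request_ip)
    if level is not None:
        return level
    if credentials is not None and verify is not None and verify(*credentials):
        return "authenticated"
    return None
```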
  • Using information gathered from the various mobile device sensors, the various mobile device settings and calendar data, a processor can infer a user's current status. Combining this inference with the requestor's authorization level, the processor may select an avatar to display in accordance with the user's preferences, step 404. As discussed above with respect to the various embodiments illustrated in FIGS. 5-17, a variety of processors may be used to make this determination. For example, the determination of which avatar to display may be made by the processor of the mobile device 301, the server 109, or the requestor's computing device. In addition, once the appropriate avatar has been selected, it may be displayed in any of the ways described above. For example, the avatar may be displayed as part of a webpage hosted by a server, displayed directly on the requestor's computing device, or attached as an image file sent in an e-mail, SMS or other message.
  • Similar to embodiments discussed above, data from mobile device sensors (e.g., 350-356), the user's calendar, and the mobile device settings can be stored in a parameter value table 602. Such data may be stored in the form of absolute values (i.e., the raw sensor information), or in the form of processed information (i.e., interpreted sensor information). This distinction turns on the amount of processing of the information that is done before it is stored in the parameter value table 602. In addition, the parameter table 602 may include an additional column for recording whether the requestor is authorized or the requestor's authorization level, if there are more than two levels. For example, FIG. 19 a illustrates a parameter value table 602 similar to the parameter value table described above with reference to FIG. 6 with the addition of a column titled “authorization level” storing the results of the authorization level check performed in step 501. In the illustrated example the authorization level check has determined that the requestor is authorized.
  • With sensor, calendar, setting data and the authorization level of the requestor stored in a parameter value table 602, this information can be compared to criteria stored in an avatar selection logic table 603. An illustrative example of an avatar selection logic table 603 is shown in FIG. 19 b, which is similar to the selection logic table 601 described above with reference to FIG. 6 b. The avatar selection logic table 603 of FIG. 19 b includes an additional parameter column which records selection criteria relating to the authorization level of the requestor.
  • As described above with reference to FIG. 6 b, the avatar selection logic table 603 shown in FIG. 19 b may be stored in memory of the computing device which determines the appropriate avatar to display. As previously discussed with reference to FIG. 6 b, using an avatar selection logic table 603 to perform the avatar selection provides users with flexibility and control over the selection process. Users may customize the avatar selection logic table 603 such that the avatar selected for display is chosen based upon a variety of parameters including the authorization level of the requestor. The avatar selection logic table 603 provides users with the additional option of assigning different avatars for display based on the authorization levels of the requestor. This customization of the avatar selection logic table 603 may be accomplished during a user setup process. Any of the setup methods discussed above for populating the avatar selection logic table 601 may be implemented to construct the avatar selection logic table 603.
  • The authorization level may be simply a binary level denoting whether a requestor is authorized. In such a binary system, two different avatars may be displayed for identical selection criteria of sensor, calendar and settings data depending on whether the requestor is authorized. For example, a more general avatar may be displayed in instances where the requestor is not authorized while a detailed or more accurate avatar may be displayed for the identical selection criteria of sensor, calendar and setting data if the requestor is authorized. The first user may simply elect to assign a value of “no avatar,” meaning that no avatar is to be displayed or transmitted if the requestor is not authorized. Alternatively, a first user may set multiple levels of authorization, each resulting in the display of a different avatar for identical selection criteria of sensor, calendar and settings values.
  • In the example illustrated in FIG. 19 b, for identical sensor, calendar and setting data the user has defined in data records 650 and 651 two different avatars entitled "work" and "meeting," respectively. The avatar selection process is accomplished in the same manner as described above with respect to avatar selection logic table 601. In this example, selection criteria for both data records 650 and 651 include: a location at the office and low velocity as recorded by a GPS sensor; low ambient noise; "bright" ambient light conditions; zero or low accelerometer readings (e.g., consistent with sitting or walking); calendar data indicating that a meeting is scheduled; and professional wallpaper settings (such as the company logo). The "work" and "meeting" avatar selection criteria may not include values for some parameters which the user anticipates are not likely to help resolve the user's status for that particular avatar. For example, the user has decided that the ambient temperature and ring tone values provide no additional value for selecting either the work or meeting avatar. If the sensor and setting data stored in the parameter value table 602 match the criteria in data records 650 and 651, a different avatar will be selected and displayed depending on the authorization level of the requestor (in this case whether the requestor is authorized or not). If the requestor is not authorized (i.e., authorization level="no"), this may mean that the requester is merely a member of the general public not known to the user. For such general public members, the user may want to indicate that the user is at work without disclosing the exact activity in which the user is currently engaged. Thus, while the calendar parameter may indicate that the user is currently in a "Board Meeting," the avatar displayed to an unauthorized requester is the more generic "Work" avatar of the user engaged in the user's occupation.
  • In contrast, if the requester is authorized (i.e., authorization level stored in table 602 is “yes”), this may mean that the requestor is a co-worker, boss, or family member (for example) to which the user wants to disclose an accurate avatar. For such requesters, the user may wish to accurately indicate the activity in which the user is engaged so that more information will be conveyed to a known requestor. Thus, if the requestor is authorized, the “meeting” avatar may be displayed showing the user engaged in a meeting or presentation.
  • In the case of data records 652 and 653, the selection criteria include: a location at home and low velocity as recorded by a GPS sensor; "dark" ambient light conditions; and zero accelerometer readings (e.g., consistent with sitting or sleeping). Both data records 652 and 653 may omit values for some parameters which the user anticipates are not likely to help resolve the user's status for that particular avatar. For example, the user has decided that the ambient temperature, calendar data, wallpaper and ring tone values provide no additional value for selecting the avatar to display. Rather, the user may feel that if the user is at home, the user is likely sleeping. However, the user may not want to indicate to co-workers, or more specifically the user's boss, that the user is sleeping. Thus, only authorized requestors, such as friends and family members, will receive the "sleeping" avatar. Non-authorized requesters, such as the user's boss, will simply receive a "busy" avatar according to the selection criteria in data record 653.
  • A user may program the avatar selection logic table 603 with multiple levels of authorization such that more than two different avatars may be displayed for identical sensor, calendar and settings data depending upon the authorization level of the requestor. For example, in data records 654-656, the selection criteria include: a location at the golf course and low velocity as recorded by a GPS sensor; "bright" ambient light conditions; zero to low accelerometer readings (e.g., consistent with walking or riding in a golf cart); an ambient temperature greater than 75 degrees; and a calendar setting indicating a weekday. In such circumstances, the user may not wish to inform either the user's spouse or boss that the user is golfing on a weekday during business hours. Accordingly, a requester whose authorization level indicates that the requestor is on the user's buddy list will be sent the "golfing" avatar (see data record 655). However, the user's spouse may have an authorization level of "family" which causes the "busy" avatar to be selected for display. Additionally, the user's boss may have an authorization level which would cause a "work" avatar to be selected for display.
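  • In terms of the table representation sketched earlier, the authorization level simply becomes one more matching criterion, so identical sensor criteria can map to different avatars for different viewers; the following Python fragment is an illustrative assumption of how records 654-656 might look:

```python
# Hypothetical data records 654-656: identical sensor criteria, different
# authorization levels, different avatars. Field names and values are
# illustrative assumptions.
GOLF_CRITERIA = {
    "gps_location": "golf_course",
    "gps_speed": (0.0, 3.0),        # walking or riding in a golf cart
    "ambient_light": "bright",
    "accelerometer": (0.0, 0.3),
    "ambient_temp": (75.0, 120.0),  # greater than 75 degrees
    "calendar": "weekday",
}

AUTH_AWARE_RECORDS = [
    {**GOLF_CRITERIA, "auth_level": "buddy",  "avatar": "Golfing"},  # record 655
    {**GOLF_CRITERIA, "auth_level": "family", "avatar": "Busy"},
    {**GOLF_CRITERIA, "auth_level": "boss",   "avatar": "Work"},
]
```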
  • Users may populate the avatar selection logic table 603 in a manner similar to that described above with reference to FIG. 6 c. FIG. 19 c illustrates an example embodiment calibration method suitable for completing an avatar selection logic table that includes a requestor's authorization level. To begin the process, a user may select a particular avatar to calibrate, step 610. This avatar may have already been created by the user and given a name. Alternatively, the user may enter a name for an avatar yet to be created. The user then begins the activity and initiates the calibration, such as by pressing a particular key on the mobile handset, step 612. During the activity, the mobile device 301 records sensor data and device settings, step 614. For some sensors this may involve recording sensor readings over a period of time along with the time of each recording in order to be able to recognize time-based patterns. Alternatively, the user may be permitted to take multiple sensor readings and average them to refine the calibration of parameter readings for an avatar. For example, avatars may be displayed to indicate an increasing or decreasing effort on the part of a user during an exercise session depending on the refined heart rate sensor readings. After a period of time, the user may end the calibration, such as by pressing a particular key on the mobile device, step 616. Alternatively, the calibration may proceed for a preset amount of time, so that step 616 occurs automatically. After the sensor data and device settings have been recorded, step 614, the user is given the option of adding an authorization level to the data record, step 615. The user may be prompted to select alternative avatars for the recorded sensor and setting data. The user may be further prompted to select an authorization level corresponding to each selected alternative avatar such that requestors possessing the selected authorization level will receive the selected alternative avatar whenever the sensor and settings data match the recorded criteria. Once calibration data gathering is completed, the processor 391 of the mobile device 301 can analyze the recorded sensor data using well-known statistical processes as described more fully above with reference to FIG. 6 c.
  • Once analyzed, the conclusions of the analyzed sensor data and mobile device settings may be stored in the avatar selection logic table 603 in a data record including the authorization level and avatar name, step 620. The avatar selection logic table 603 may be stored in memory 392 of the mobile device 301. The user may repeat the process of selecting an avatar for calibration, engaging in the associated activity while allowing the mobile device 301 to perform a calibration routine and storing the analyzed sensor data in the avatar selection logic table 603 until criteria have been saved for all of the user's avatars. This may include multiple avatar settings for different authorization levels. Optionally, when the avatar selection logic table 603 is completed, the table may be transmitted to another computing device, such as a server 109 on which the user's avatar is hosted, step 622. In addition, a learning method as described above with respect to avatar selection logic table 601 may be implemented.
  • Users may repeat the process illustrated in FIG. 19 c at any time to update the calibration or criteria for a particular avatar or to add selection criteria for a newly created avatar.
  • FIG. 20 is a process flow diagram of an embodiment method in which the avatar to be displayed is determined based upon the sensor and setting data as well as the requestor's authorization level. In this embodiment, the mobile device 301 processor 391 performs steps 401-431 in a manner substantially the same as described above with reference to FIGS. 5-12.
  • Once the webpage access request is received, the processor of the server 109 can implement any of a number of methods discussed previously to check the authorization level of the person or device requesting access to the webpage (the "second user"), step 501. Data regarding the second user may be sent from the second user's device back to the server 109 processor to complete the authorization level check, step 503.
  • Once the authorization level of the second user is determined, this information is stored in the parameter table held in the memory of the server 109. The data record of the parameter table is compared against the criteria in an avatar selection logic table 603 stored in memory of the server 109 to determine which avatar to display, step 411. As described more fully above, a variety of method embodiments may be used to evaluate the stored parameter data and select a particular avatar for display. For example, the processor of the server 109 may select the avatar for which the greatest number of selection criteria are satisfied by the parameters stored in the parameter data table. Once the appropriate avatar has been selected, the processor of the server 109 can insert the selected avatar into the webpage and send the webpage to the requestor as described more fully above with reference to FIGS. 5-12 for steps 430-441. Since the polling and analysis of sensor and settings data is performed autonomously, the user's avatar presented to others is maintained consistent with the user's current status without input by the user, unless the user has elected to alter the avatar to be selected based upon the requestor's authorization level.
  • FIG. 21 is a process flow diagram of an embodiment method suitable for implementation on a mobile handset which conserves battery and processor time and determines an avatar to display based upon an avatar selection logic table which includes data related to the authorization level of the second user. The embodiment illustrated in FIG. 21 includes steps 401-440 described above with reference to FIGS. 5-12. The processor selects an avatar to display in the same manner as described above with reference to FIG. 20 for steps 501-503 and 411. Once the appropriate avatar has been selected, the processor of the server 109 can insert the selected avatar into the webpage and send the webpage to the requestor as described more fully above with reference to FIGS. 5-12 for steps 430-441.
  • FIG. 22 is a process flow diagram of an embodiment which conserves processing power of the mobile device 301 by only polling sensors and checking device settings, steps 401-409, when prompted by the server 109 hosting the user's webpage, step 455. This embodiment includes the steps 401-455 described above with reference to FIGS. 5-12. The processor receives and checks the requestor's authorization level, steps 501-503, and selects the appropriate avatar, step 411, as described above with reference to FIGS. 20 and 21. Once the appropriate avatar has been selected, the processor of the server 109 can insert the selected avatar into the webpage and send the webpage to the requestor as described more fully above with reference to FIGS. 5-12 for steps 430-441.
  • FIG. 23 is a process flow diagram of another embodiment which conserves processing power of the mobile device 301 by only polling sensors, checking device settings and selecting an avatar for display when the server 109 receives a request for the user's avatar. The embodiment shown in FIG. 23 operates much in the same manner as the embodiment illustrated in FIG. 9 with the addition of checking the authorization level of the requester, steps 501, 503, described above with reference to FIGS. 20 and 21. The authorization level determined in steps 501 and 503 is transmitted to the mobile device 301 for storage in the parameter table 602, step 409, as described above with reference to FIGS. 5-9. As a result, the avatar may be selected based upon the authorization level of the second user. Once the appropriate avatar has been selected, the avatar can be sent to the requester as described more fully above with reference to FIG. 9 for steps 415-441.
  • FIG. 24 a is a process flow diagram of another embodiment method suitable for displaying an avatar, selected based upon sensor and setting data as well as the authorization level of a second user, directly on the requesting device. The embodiment illustrated in FIG. 24 a operates in the same manner as described above with reference to FIG. 14 a with the addition of checking the authorization level of the second user, steps 501 and 503, and storing the authorization level data in the parameter table 602, step 502, described above with reference to FIGS. 20 and 21. Once the authorization level data is stored in the parameter table 602, the complete parameter table 602 can be compared against the avatar selection logic table 603 to select and display the appropriate avatar as described above with reference to FIGS. 14 a, 20 and 21 for steps 411, 432, 463, 442.
  • Alternatively, similar to the embodiments shown in FIGS. 14 b, 15 b and 16 b, the embodiment shown in FIG. 24 b modifies the embodiment method shown in FIG. 24 a by adding steps 504-508 which transmit only an identifier of the selected avatar file to the second user's device. The local memory of the requesting device is checked to determine whether the transmission of the avatar file is necessary or not. If the avatar file already exists in the local memory of the requesting device, then the avatar file is immediately displayed. If the avatar file does not exist in local memory, then a request for the avatar file is made and subsequently transmitted for display. The alternative embodiment shown in FIG. 24 b conserves processing power, battery life and transmission bandwidth and further allows for the use of updated or new avatar files.
  • Similar to the embodiment illustrated in FIG. 16 a, the embodiment illustrated in FIG. 25 a conserves processor and battery time of the mobile device 301 by responding to an avatar request made by a second user. The embodiment illustrated in FIG. 25 a operates in the same manner as the embodiment described above with reference to FIG. 16 a with the addition of checking and sending the authorization level of the second user, steps 501 and 503, described above with reference to FIGS. 20 and 21. The authorization level data is stored in the parameter table 602 along with the various sensor and settings data, step 409. Once stored, the complete parameter table 602 can be compared against the avatar selection logic table 603, step 410, to select the appropriate avatar to display, step 411. Once selected, the processor 391 transmits the selected avatar file to the requesting computing device, step 415, which receives the avatar file, step 462, and displays the selected avatar, step 441, as described above with reference to FIGS. 14-16.
  • Again, similar to the embodiments shown in FIGS. 14b, 15b and 16b, the embodiment shown in FIG. 25b modifies the embodiment method shown in FIG. 25a by adding steps 504-508, which transmit only an identifier of the selected avatar file to the second user's device. The local memory of the requesting device is checked to determine whether transmission of the avatar file is necessary. If the avatar file already exists in the local memory of the requesting device, the avatar file is displayed immediately. If the avatar file does not exist in local memory, a request for the avatar file is made and the file is subsequently transmitted for display. The alternative embodiment shown in FIG. 25b conserves processing power, battery life and transmission bandwidth, and further allows for the use of updated or new avatar files.
  • Similar to the embodiment illustrated in FIG. 17, the embodiment illustrated in FIG. 26 pre-stores the avatar files and the avatar selection logic table 603 on computing devices that are authorized to request the user's avatar. While each of the requesting computing devices is previously authorized to request the user's avatar, some computing devices may be authorized to view only certain avatars in response to various sensor and settings data. The embodiment illustrated in FIG. 26 operates in the same manner as described above with reference to FIG. 17, with the addition of checking and sending the authorization level of the second user, steps 501 and 503, as described above with reference to FIGS. 20 and 21. The authorization level data is stored in the parameter table 602 along with the various sensor and settings data, step 409. Once stored, the complete parameter table 602 can be transmitted to the second user's requesting computing device, step 416, and compared against the avatar selection logic table 603 to select and display the appropriate avatar as described above with reference to FIG. 17 for steps 462, 464, 466, 441.
  • In an embodiment, artificial intelligence routines may be implemented on the mobile device to prompt users to select an avatar to display when repeated parameter patterns are recognized. For example, if a mobile device continuously polls the GPS sensor 354 and an artificial intelligence application notices that the device is at the same GPS location coordinates between 9 am and 5 pm during weekdays, the processor may prompt the user to identify the name of the location, and suggest a name of “Work”. Alternatively, if the processor 391 notices that the user's pulse detected by a pulse sensor (not shown) has risen above 100 beats per minute, the processor 391 may prompt the user to identify the user's current activity as “exercise.” At the same time, if the processor 391 recognizes from the GPS sensor that the mobile device 301 is moving at a speed greater than 3 miles per hour, the processor 391 may prompt the user to identify the user's current activity as “Running.”
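  • A toy sketch of such a pattern-recognition prompt (Python; the coordinate rounding, sample threshold, and 9-to-5 weekday rule are assumptions chosen for illustration):

        from collections import Counter
        from datetime import datetime

        location_log = []  # (timestamp, lat, lon) samples from the GPS sensor 354

        def record_fix(ts, lat, lon):
            # Round coordinates so repeated visits to one place collapse together.
            location_log.append((ts, round(lat, 3), round(lon, 3)))

        def suggest_location_name(min_hits=20):
            # Count weekday fixes between 9 AM and 5 PM per rounded coordinate.
            hits = Counter(
                (lat, lon)
                for ts, lat, lon in location_log
                if ts.weekday() < 5 and 9 <= ts.hour < 17
            )
            for coords, n in hits.most_common(1):
                if n >= min_hits:
                    # Prompt the user to name the location, suggesting "Work".
                    return coords, 'Name this location? (suggested: "Work")'
            return None

        record_fix(datetime(2008, 5, 27, 10, 30), 32.7157, -117.1611)  # a weekday fix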
  • In embodiments where the number of avatar files stored in memory is limited, such as when the avatar files are stored directly on the computing device that will display them, fewer and more generic avatars may be desired. In such instances, fewer parameters may be needed to accurately reflect a user's current status. Conversely, if avatar files are stored on a computing device with greater storage and processing capabilities, such as server 109, the number of avatar files may be increased, as well as the level of precision in matching an avatar with the user's current status.
  • In further embodiments, parameter data may cause an avatar to change consistent with changes in the user's activity. By refining the avatar selection logic table 601, 603, varying avatars may be displayed in response to changes in parameter data. For example, if a user is running, as inferred from the GPS sensor 354 measuring a velocity of about 6 miles per hour and an accelerometer 353 indicating a periodicity of accelerations consistent with running, the avatar selected for display may be an animated image of a runner. As the speed recorded by the GPS sensor 354 increases, the avatar selected for display may show the running image moving faster, and/or show the increased effort by displaying an avatar that is running, sweating and breathing harder. Including such additional avatars, including animated avatars, simply requires adding another line in the avatar selection logic table 601, 603 linking the increased speed as a requirement to display a new avatar file.
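  • A sketch of what such additional rows might look like (Python; only the roughly 6 miles per hour figure comes from the example above, while the other thresholds and file names are assumptions):

        # Hypothetical speed-keyed rows of avatar selection logic table 601/603:
        # higher measured speeds select progressively more strenuous avatars.
        running_rows = [
            {"min_mph": 4.0, "max_mph": 6.0, "avatar": "runner.gif"},
            {"min_mph": 6.0, "max_mph": 8.0, "avatar": "runner_fast.gif"},
            {"min_mph": 8.0, "max_mph": 99.0, "avatar": "runner_straining.gif"},
        ]

        def running_avatar(speed_mph):
            # Return the avatar for the speed band the GPS reading falls into.
            for row in running_rows:
                if row["min_mph"] <= speed_mph < row["max_mph"]:
                    return row["avatar"]
            return None  # speed not consistent with running

        print(running_avatar(6.5))  # -> runner_fast.gif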
  • By implementing the various methods disclosed herein, a first user can provide a second user with an accurate representation of the first user's current activity. For example, in an internet forum setting, the displayed avatars could change dynamically as the user's status changes. In conventional usage, a user may change the avatar displayed to other members of the internet forum, but only by proactively changing the file selected for display. In the embodiments disclosed herein, other members of the internet forum may observe the user's avatar change automatically as the user changes activity or status. In this way, the user no longer has to actively alter the displayed avatar in order to reflect his or her status. One of ordinary skill in the art would appreciate that similar applications can be implemented in instant messaging, text messaging, or even regular phone call situations.
  • The various embodiments provide a number of new applications for mobile devices. Such applications include improving communications with colleagues, monitoring activities of children, broadening participation in games, and medical monitoring.
  • As mentioned above, an avatar can quickly communicate information regarding a user since “a picture is worth a thousand words.” Users may select avatars and program the avatar selection criteria so that colleagues can quickly determine their status prior to sending an e-mail or making a telephone call. For example, if a colleague has access to a user's avatar, a quick view of the avatar prior to sending an e-mail or placing a telephone call will inform the colleague whether the user is involved in some activity that will preclude a prompt reply, such as being out for a run, in a meeting, or on vacation. Access links to avatars (e.g., a hyperlink to an IP address hosting the avatar) may be incorporated into address books so that an individual who has been given access rights to a user's avatar can check on the user's status prior to, or as part of, sending an e-mail or placing a telephone call. In this manner, the user of an avatar can proactively inform selected colleagues of the user's status. For example, by posting an avatar showing that the user is in a meeting (or traveling or on vacation), those who may want to contact the user will be informed that a call will not be answered and e-mail may not be promptly read.
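  • A minimal sketch of an address-book entry carrying such an access link (Python; the URL, field names, and fetch-before-contact flow are illustrative assumptions):

        import urllib.request

        address_book = [
            {
                "name": "A. Colleague",
                "email": "colleague@example.com",
                # Hypothetical hyperlink to the address hosting this contact's avatar.
                "avatar_url": "http://avatars.example.com/colleague/current",
            },
        ]

        def check_status_before_contacting(entry):
            # Fetch the contact's current avatar so the caller can see, e.g., an
            # "in a meeting" image before placing the call or sending the e-mail.
            url = entry.get("avatar_url")
            if url is None:
                return None  # no access rights to this contact's avatar
            with urllib.request.urlopen(url) as response:
                return response.read()  # avatar image bytes for display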
  • In an application of the various embodiments, parents may be able to keep track of children when they are out of their sight. For example, children wearing a mobile device configured according to one or more of the embodiments can be tracked by parents accessing a website that displays avatars of their children including their location and present activity. For example, children involved in a game of “hide ‘n’ seek” may be monitored by their parents viewing a map of the play area which includes avatars indicating the location and movement/status of each child.
  • In game settings, the displayed avatar may be more closely linked to the user's real-world movements and activities. For example, the various embodiments may enable a user to be involved in a game of paintball while spectators watch the paintball match in a virtual world representation. Each participant can be equipped with a mobile device 301 including a suite of motion, position and location sensors, each reporting sensor data in near-real time to a central server 109. Additionally, position sensors may be outfitted on users' limbs and coupled to the mobile device by a wireless data link (e.g., Bluetooth) to provide data on the posture and movements of participants, such as the direction in which an individual is facing or aiming a paintball gun. An avatar representing each paintball participant can be generated in the virtual world representation of the match, with each user's avatar changing location and activity (e.g., running, sitting, hiding) based on the mobile device 301 sensor data, which are sampled in real time. The virtual world avatar representations may therefore accurately mimic the movement and activity of the real-world users carrying the mobile devices 301.
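  • One participant's reporting loop might be sketched as follows (Python; the UDP transport, JSON message shape, and sampling period are assumptions, and read_sensors() stands in for the device's sensor suite):

        import json
        import socket
        import time

        def stream_samples(server_host, server_port, read_sensors, period_s=0.2):
            # read_sensors() is assumed to return a dict of the latest motion,
            # position, location, and limb-sensor values (e.g., gathered over
            # Bluetooth) for this paintball participant.
            sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
            while True:
                sample = read_sensors()
                sample["t"] = time.time()
                # One small datagram per sample keeps latency low so the virtual
                # world avatars can track the real-world action in near-real time.
                sock.sendto(json.dumps(sample).encode(), (server_host, server_port))
                time.sleep(period_s)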
  • Clearly, the same techniques that apply to games could be transferred to training exercises of a national defense nature.
  • In a medical monitoring application, medical sensors on the mobile device 301, or connected to a processor by wireless data links (e.g., Bluetooth), can report their data (e.g., through the mobile device 301) to a system that uses such information to select an avatar reflecting the patient's current status. In a medical setting, the processor need not be mobile, and instead may be associated with a facility, such as an emergency room or hospital information system. Sensor data associated with patients can be received from a variety of medical sensors coupled to each patient, such as blood pressure, pulse, EKG, and EEG sensors, for example. Avatar selection criteria associated with each of the sensors may be used to select an avatar that reflects a patient's medical needs or condition. For example, if a medical sensor provides data that satisfies an avatar selection criterion for a patient in distress, a processor can select an avatar consisting of the patient's photograph with a red background, and display that avatar at a nursing station. Such an avatar can communicate critical information more efficiently than text (e.g., the patient's name and the medical data) presented on the screen. As another example, a pacemaker may be configured to transmit information regarding the condition of the device or the patient's heart to a mobile device, such as by means of a Near Field Communications data link, which can relay the data to a server accessible by the patient's doctor. That server can use the patient's pacemaker data to select an appropriate avatar to efficiently communicate the patient's status to the doctor.
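  • A minimal sketch of one such distress criterion (Python; the vital-sign thresholds are illustrative placeholders, not clinical guidance):

        def patient_avatar(photo_file, vitals):
            # vitals: latest readings relayed from the patient's medical sensors,
            # e.g., blood pressure and pulse.
            in_distress = (
                vitals.get("pulse_bpm", 0) > 140
                or vitals.get("systolic_mmHg", 120) > 180
            )
            if in_distress:
                # Patient's photograph on a red background for the nursing station.
                return {"image": photo_file, "background": "red"}
            return {"image": photo_file, "background": "neutral"}

        print(patient_avatar("smith.jpg", {"pulse_bpm": 150}))  # red background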
  • The hardware used to implement the foregoing embodiments may be processing elements and memory elements configured to execute a set of instructions, wherein the set of instructions are for performing method steps corresponding to the above methods. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
  • Those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
  • The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in a processor-readable storage medium and/or processor-readable memory, either of which may be any of RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other tangible form of data storage medium known in the art. Moreover, the processor-readable memory may comprise more than one memory chip, memory internal to the processor chip, memory in separate memory chips, and combinations of different types of memory such as flash memory and RAM memory. References herein to the memory of a mobile handset are intended to encompass any one or all memory modules within the mobile handset without limitation to a particular configuration, type or packaging. An exemplary storage medium is coupled to a processor in either the mobile handset or the server such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC.
  • The foregoing description of the various embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein, and instead the claims should be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (62)

1. A method for automatically updating an avatar to indicate a mobile device user's status, comprising:
polling sensors connected to the user's mobile device;
comparing sensor data to avatar selection criteria; and
selecting an avatar for display based upon the comparison of the sensor data to the avatar selection criteria.
2. The method of claim 1, further comprising:
comparing calendar data to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of calendar data with avatar selection criteria.
3. The method of claim 1, further comprising:
comparing mobile device settings to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of mobile device settings with avatar selection criteria.
4. The method of claim 1, further comprising:
comparing an authorization level of a second user to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the authorization level of the second user with avatar selection criteria.
5. The method of claim 1, wherein the steps of polling sensors, comparing sensor data to avatar selection criteria and selecting an avatar for display are performed in response to a request for an avatar received from a server.
6. The method of claim 1, wherein the steps of polling sensors, comparing sensor data to avatar selection criteria and selecting an avatar for display are performed in response to a request for an avatar received from another computing device, and further comprising transmitting the selected avatar to the other computing device for display.
7. The method of claim 5, wherein the server requests an avatar from the mobile device only when a request for the avatar is received by the server.
8. The method of claim 1, further comprising:
transmitting an identifier of the selected avatar to a server; and
recalling an avatar file from the server's memory using the identifier.
9. The method of claim 1, further comprising transmitting the sensor data to a server, wherein the steps of comparing the sensor data to avatar selection criteria and selecting an avatar for display based upon the comparison are performed on the server.
10. The method of claim 9, further comprising inserting the selected avatar into a webpage hosted by the server.
11. The method of claim 9, further comprising:
transmitting calendar data to the server;
comparing calendar data to avatar selection criteria at the server; and
selecting the avatar for display further based upon the comparison of calendar data with avatar selection criteria performed at the server.
12. The method of claim 9, further comprising:
transmitting mobile device settings to the server;
comparing mobile device settings to avatar selection criteria at the server; and
selecting the avatar for display further based upon the comparison of mobile device settings with avatar selection criteria performed at the server.
13. The method of claim 1, further comprising transmitting the sensor data to another computing device, wherein the steps of comparing the sensor data to avatar selection criteria and selecting an avatar for display based upon the comparison are performed on the other computing device.
14. The method of claim 13, further comprising:
transmitting calendar data to the other computing device;
comparing calendar data to avatar selection criteria at the other computing device; and
selecting the avatar for display further based upon the comparison of calendar data with avatar selection criteria performed at the other computing device.
15. The method of claim 13, further comprising:
transmitting mobile device settings to the other computing device;
comparing mobile device settings to avatar selection criteria at the other computing device; and
selecting the avatar for display further based upon the comparison of mobile device settings with avatar selection criteria performed at the other computing device.
16. The method of claim 13, further comprising:
obtaining an authorization level of the other computing device;
comparing the authorization level of the other computing device to avatar selection criteria at the other computing device; and
selecting the avatar for display further based upon the comparison of the authorization level of the other computing device with avatar selection criteria performed at the other computing device.
17. The method of claim 1, wherein the step of polling sensors connected to the user's mobile device comprises obtaining data from at least one sensor selected from the group consisting of a global positioning sensor, an accelerometer, a temperature sensor, a biometric sensor, a light sensor, and a noise sensor.
18. A mobile device, comprising:
a processor;
a transceiver coupled to the processor;
a memory coupled to the processor; and
at least one sensor configured to measure a parameter, the at least one sensor selected from the group consisting of a global positioning sensor, an accelerometer, a temperature sensor, a biometric sensor, a light sensor, and a noise sensor, wherein the processor is configured with software instructions to perform steps comprising:
receiving sensor data from the at least one sensor;
comparing the sensor data to avatar selection criteria; and
selecting an avatar for display based upon the comparison of the sensor data to the avatar selection criteria.
19. The mobile device of claim 18, wherein the processor is configured with software instructions to perform steps further comprising:
comparing calendar data to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of calendar data with avatar selection criteria.
20. The mobile device of claim 18, wherein the processor is configured with software instructions to perform steps further comprising:
comparing mobile device settings to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of mobile device settings with avatar selection criteria.
21. The mobile device of claim 18, wherein the processor is configured with software instructions to perform steps further comprising:
comparing an authorization level of a second user to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the authorization level of the second user with avatar selection criteria.
22. The mobile device of claim 18, wherein the processor is configured with software instructions to perform the steps of receiving sensor data, comparing sensor data to avatar selection criteria and selecting an avatar for display in response to a request for an avatar received from a server.
23. The mobile device of claim 18, wherein the processor is configured with software instructions to perform the steps of receiving sensor data, comparing sensor data to avatar selection criteria and selecting an avatar for display in response to a request for an avatar received from another computing device.
24. The mobile device of claim 22, wherein the processor is configured with software instructions to perform steps further comprising transmitting an identifier of the selected avatar to the server via the transceiver.
25. The mobile device of claim 22, wherein the processor is configured with software instructions to perform steps further comprising transmitting the selected avatar to the server via the transceiver.
26. The mobile device of claim 23, wherein the processor is configured with software instructions to perform steps further comprising transmitting the selected avatar to the other computing device via the transceiver.
27. The mobile device of claim 18, further comprising a short range wireless transceiver configured to receive sensor data from an external sensor and provide the sensor data to the processor.
28. A mobile device, comprising:
means for sensing a parameter indicative of a user's status;
means for comparing the sensed parameter to avatar selection criteria; and
means for selecting an avatar for display based upon the comparison.
29. The mobile device of claim 28, further comprising:
means for comparing calendar data to avatar selection criteria; and
means for selecting the avatar for display further based upon the comparison of calendar data with avatar selection criteria.
30. The mobile device of claim 28, further comprising:
means for comparing mobile device settings to avatar selection criteria; and
means for selecting the avatar for display further based upon the comparison of mobile device settings with avatar selection criteria.
31. The mobile device of claim 28, further comprising:
means for comparing an authorization level of a second user to avatar selection criteria; and
means for selecting the avatar for display further based upon the comparison of the authorization level of the second user with avatar selection criteria.
32. The mobile device of claim 28, further comprising means for transmitting the selected avatar to a server.
33. The mobile device of claim 28, further comprising means for receiving sensor data from a sensor external to the mobile device.
34. A tangible storage medium having stored thereon processor-executable software instructions configured to cause a processor to perform steps comprising:
receiving sensor data from at least one sensor coupled to the processor;
comparing the sensor data to avatar selection criteria; and
selecting an avatar for display based upon the comparison of the sensor data to the avatar selection criteria.
35. The tangible storage medium of claim 34, wherein the tangible storage medium has processor-executable software instructions configured to cause a processor to perform further steps comprising:
comparing calendar data to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of calendar data with avatar selection criteria.
36. The tangible storage medium of claim 34, wherein the tangible storage medium has processor-executable software instructions configured to cause a processor to perform further steps comprising:
comparing mobile device settings to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of mobile device settings with avatar selection criteria.
37. The tangible storage medium of claim 34, wherein the tangible storage medium has processor-executable software instructions configured to cause a processor to perform further steps comprising:
comparing an authorization level of a second user to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the authorization level of the second user with avatar selection criteria.
38. The tangible storage medium of claim 34, wherein the tangible storage medium has processor-executable software instructions configured to cause a processor to perform the steps of receiving sensor data, comparing sensor data to avatar selection criteria and selecting an avatar for display in response to a request for an avatar received from a server.
39. The tangible storage medium of claim 34, wherein the tangible storage medium has processor-executable software instructions configured to cause a processor to perform the steps of receiving sensor data, comparing sensor data to avatar selection criteria and selecting an avatar for display in response to a request for an avatar received from another computing device.
40. The tangible storage medium of claim 34, wherein the tangible storage medium has processor-executable software instructions configured to cause a processor to perform further steps comprising transmitting an identifier of the selected avatar to a server.
41. The tangible storage medium of claim 34, wherein the tangible storage medium has processor-executable software instructions configured to cause a processor to perform further steps comprising transmitting the selected avatar to a server.
42. A server configured to host a user's social webpage and receive and transmit data via a network, comprising:
a server memory having stored thereon the user's social webpage; and
a server processor coupled to the server memory, wherein the server processor is configured with software instructions to perform steps comprising:
receiving sensor data from the user's mobile device;
comparing the received sensor data to avatar selection criteria;
selecting an avatar for display based upon the comparison of the sensor data with avatar selection criteria; and
including the selected avatar into the user's social webpage stored in the server memory.
43. The server of claim 42, wherein the server processor is further configured with software instructions to perform steps comprising:
receiving calendar data from the user's mobile device;
comparing the calendar data to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the calendar data with avatar selection criteria.
44. The server of claim 42, wherein the server processor is further configured with software instructions to perform steps comprising:
receiving mobile device settings from the user's mobile device;
comparing the mobile device settings to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the mobile device settings with avatar selection criteria.
45. The server of claim 42, wherein the server processor is further configured with software instructions to perform steps comprising:
receiving authorization level data from a second user's computing device;
comparing the authorization level data of the second user's computing device to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the authorization level of the second user's computing device with avatar selection criteria.
46. The server of claim 42, wherein the server processor is further configured with software instructions to perform steps comprising:
receiving a parameter value table from the user's mobile device;
comparing values in the parameter value table to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the parameter value table with avatar selection criteria.
47. The server of claim 42, wherein the server processor is further configured with software instructions to perform steps comprising:
requesting the user's mobile device to transmit sensor data.
48. The server of claim 42, wherein the server processor is further configured with software instructions to perform steps comprising requesting the user's mobile device to transmit sensor data in response to receiving a request for the user's avatar.
49. A server, comprising:
means for receiving sensor data from the user's mobile device;
means for comparing the received sensor data to avatar selection criteria;
means for selecting an avatar for display based upon the comparison of the sensor data with avatar selection criteria; and
means for including the selected avatar into the user's social webpage stored in the server memory.
50. The server of claim 49, further comprising:
means for receiving calendar data from the user's mobile device;
means for comparing the calendar data to avatar selection criteria; and
means for selecting the avatar for display further based upon the comparison of the calendar data with avatar selection criteria.
51. The server of claim 49, further comprising:
means for receiving mobile device settings from the user's mobile device;
means for comparing the mobile device settings to avatar selection criteria; and
means for selecting the avatar for display further based upon the comparison of the mobile device settings with avatar selection criteria.
52. The server of claim 49, further comprising:
means for receiving authorization level data from a requesting user's computing device;
means for comparing the authorization level data of the requesting user's computing device to avatar selection criteria; and
means for selecting the avatar for display further based upon the comparison of the authorization level of the requesting user's computing device with avatar selection criteria.
53. The server of claim 49, further comprising:
means for receiving a parameter value table from the user's mobile device;
means for comparing values in the parameter value table to avatar selection criteria; and
means for selecting the avatar for display further based upon the comparison of the parameter value table with avatar selection criteria.
54. The server of claim 49, further comprising means for requesting the user's mobile device to transmit sensor data.
55. The server of claim 49, further comprising means for requesting the user's mobile device to transmit sensor data in response to receiving a request for the user's avatar.
56. A tangible storage medium having stored thereon server-executable software instructions configured to cause the server to perform steps comprising:
receiving sensor data from the user's mobile device;
comparing the received sensor data to avatar selection criteria;
selecting an avatar for display based upon the comparison of the sensor data with avatar selection criteria; and
including the selected avatar into the user's social webpage stored in the server memory.
57. The tangible storage medium of claim 56, wherein the stored server-executable software instructions are configured to cause the server to perform further steps comprising:
receiving calendar data from the user's mobile device;
comparing the calendar data to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the calendar data with avatar selection criteria.
58. The tangible storage medium of claim 56, wherein the stored server-executable software instructions are configured to cause the server to perform further steps comprising:
receiving mobile device settings from the user's mobile device;
comparing the mobile device settings to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the mobile device settings with avatar selection criteria.
59. The tangible storage medium of claim 56, wherein the stored server-executable software instructions are configured to cause the server to perform further steps comprising:
receiving authorization level data from a requesting user's computing device;
comparing the authorization level data of the requesting user's computing device to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the authorization level of the requesting user's computing device with avatar selection criteria.
60. The tangible storage medium of claim 56, wherein the stored server-executable software instructions are configured to cause the server to perform further steps comprising:
receiving a parameter value table from the user's mobile device;
comparing values in the parameter value table to avatar selection criteria; and
selecting the avatar for display further based upon the comparison of the parameter value table with avatar selection criteria.
61. The tangible storage medium of claim 56, wherein the stored server-executable software instructions are configured to cause the server to perform further steps comprising requesting the user's mobile device to transmit sensor data.
62. The tangible storage medium of claim 56, wherein the stored server-executable software instructions are configured to cause the server to perform further steps comprising requesting the user's mobile device to transmit sensor data in response to receiving a request for the user's avatar.
US12/127,349 2008-05-27 2008-05-27 Method and system for automatically updating avatar to indicate user's status Abandoned US20090300525A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US12/127,349 US20090300525A1 (en) 2008-05-27 2008-05-27 Method and system for automatically updating avatar to indicate user's status
KR1020107029200A KR20110014224A (en) 2008-05-27 2009-05-12 Method and system for automatically updating avatar status to indicate user's status
EP09755612A EP2294802A1 (en) 2008-05-27 2009-05-12 Method and system for automatically updating avatar status to indicate user's status
JP2011511693A JP5497015B2 (en) 2008-05-27 2009-05-12 Method and system for automatically updating avatar status to indicate user status
CN201610090062.5A CN105554311A (en) 2008-05-27 2009-05-12 Method and system for automatically updating avatar status to indicate user's status
CN2009801187120A CN102037716A (en) 2008-05-27 2009-05-12 Method and system for automatically updating avatar status to indicate user's status
PCT/US2009/043542 WO2009146250A1 (en) 2008-05-27 2009-05-12 Method and system for automatically updating avatar status to indicate user's status
JP2013229820A JP2014059894A (en) 2008-05-27 2013-11-05 Method and system for automatically updating avatar status to indicate user's status

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/127,349 US20090300525A1 (en) 2008-05-27 2008-05-27 Method and system for automatically updating avatar to indicate user's status

Publications (1)

Publication Number Publication Date
US20090300525A1 true US20090300525A1 (en) 2009-12-03

Family

ID=41056788

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/127,349 Abandoned US20090300525A1 (en) 2008-05-27 2008-05-27 Method and system for automatically updating avatar to indicate user's status

Country Status (6)

Country Link
US (1) US20090300525A1 (en)
EP (1) EP2294802A1 (en)
JP (2) JP5497015B2 (en)
KR (1) KR20110014224A (en)
CN (2) CN102037716A (en)
WO (1) WO2009146250A1 (en)

Cited By (286)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090157481A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying a cohort-linked avatar attribute
US20090157625A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for identifying an avatar-linked population cohort
US20090156907A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying an avatar
US20090157660A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems employing a cohort-linked avatar
US20090157751A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying an avatar
US20090157482A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for indicating behavior in a population cohort
US20090164549A1 (en) * 2007-12-20 2009-06-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for determining interest in a cohort-linked avatar
US20090172540A1 (en) * 2007-12-31 2009-07-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Population cohort-linked avatar
US20090254859A1 (en) * 2008-04-03 2009-10-08 Nokia Corporation Automated selection of avatar characteristics for groups
US20090275414A1 (en) * 2007-03-06 2009-11-05 Trion World Network, Inc. Apparatus, method, and computer readable media to perform transactions in association with participants interacting in a synthetic environment
US20090300513A1 (en) * 2008-06-02 2009-12-03 Nike, Inc. System and Method for Creating an Avatar
US20090309711A1 (en) * 2008-06-16 2009-12-17 Abhishek Adappa Methods and systems for configuring mobile devices using sensors
US20100046806A1 (en) * 2008-08-22 2010-02-25 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US20100050088A1 (en) * 2008-08-22 2010-02-25 Neustaedter Carman G Configuring a virtual world user-interface
US20100050253A1 (en) * 2008-08-22 2010-02-25 International Business Machines Corporation System and method for real world biometric analytics through the use of a multimodal biometric analytic wallet
US20100062856A1 (en) * 2008-09-09 2010-03-11 Skype Limited User interface
US20100076870A1 (en) * 2008-03-13 2010-03-25 Fuhu. Inc Widgetized avatar and a method and system of virtual commerce including same
US20100094936A1 (en) * 2008-10-15 2010-04-15 Nokia Corporation Dynamic Layering of an Object
US20100106782A1 (en) * 2008-10-28 2010-04-29 Trion World Network, Inc. Persistent synthetic environment message notification
US20100115422A1 (en) * 2008-11-05 2010-05-06 At&T Intellectual Property I, L.P. System and method for conducting a communication exchange
US20100131864A1 (en) * 2008-11-21 2010-05-27 Bokor Brian R Avatar profile creation and linking in a virtual world
US20100134484A1 (en) * 2008-12-01 2010-06-03 Microsoft Corporation Three dimensional journaling environment
US20100153869A1 (en) * 2008-12-15 2010-06-17 International Business Machines Corporation System and method to visualize activities through the use of avatars
US20100153497A1 (en) * 2008-12-12 2010-06-17 Nortel Networks Limited Sharing expression information among conference participants
US20100153868A1 (en) * 2008-12-12 2010-06-17 International Business Machines Corporation System and method to modify avatar characteristics based on inferred conditions
US20100229107A1 (en) * 2009-03-06 2010-09-09 Trion World Networks, Inc. Cross-interface communication
US20100229106A1 (en) * 2009-03-06 2010-09-09 Trion World Network, Inc. Synthetic environment character data sharing
US20100227688A1 (en) * 2009-03-06 2010-09-09 Trion World Network, Inc. Synthetic environment character data sharing
US20100299747A1 (en) * 2009-05-21 2010-11-25 International Business Machines Corporation Identity verification in virtual worlds using encoded data
US20110022443A1 (en) * 2009-07-21 2011-01-27 Palo Alto Research Center Incorporated Employment inference from mobile device data
US20110028219A1 (en) * 2009-07-29 2011-02-03 Disney Enterprises, Inc. (Burbank, Ca) System and method for playsets using tracked objects and corresponding virtual worlds
US20110066423A1 (en) * 2009-09-17 2011-03-17 Avaya Inc. Speech-Recognition System for Location-Aware Applications
US20110084800A1 (en) * 2009-10-14 2011-04-14 Lee-Chun Ko Access Authorization Method And Apparatus For A Wireless Sensor Network
US20110087344A1 (en) * 1999-05-12 2011-04-14 Wilbert Quinc Murdock Smart golf software
US20110264728A1 (en) * 2010-04-23 2011-10-27 Research In Motion Limited Method and apparatus for electronically posting a graphic identifier to a plurality of servers
US20110298808A1 (en) * 2010-06-02 2011-12-08 Toyota Motor Engineering & Manufacturing North America, Inc. Animated Vehicle Attendance Systems
US20110298618A1 (en) * 2010-06-02 2011-12-08 Apple Inc. Remote User Status Indicators
US20110302513A1 (en) * 2008-11-24 2011-12-08 Fredrik Ademar Methods and apparatuses for flexible modification of user interfaces
EP2418123A1 (en) * 2010-08-11 2012-02-15 Valeo Schalter und Sensoren GmbH Method and system for supporting a driver of a vehicle in manoeuvring the vehicle on a driving route and portable communication device
US20120038667A1 (en) * 2010-08-11 2012-02-16 International Business Machines Corporation Replicating Changes Between Corresponding Objects
WO2012040392A3 (en) * 2010-09-21 2012-05-31 Cellepathy Ltd. System and method for sensor-based determination of user role, location, and/or state of one of more in-vehicle mobile devices and enforcement of usage thereof
US20120135804A1 (en) * 2010-06-07 2012-05-31 Daniel Bender Using affect within a gaming context
US8229079B2 (en) 2010-12-08 2012-07-24 Google Inc. Propagating user status information across computing resources
US20120246261A1 (en) * 2011-03-22 2012-09-27 Roh Yohan J Method and apparatus for managing sensor data and method and apparatus for analyzing sensor data
US20120270601A1 (en) * 2009-06-16 2012-10-25 Bran Ferren Intelligent graphics interface in a handheld wireless device
US20120293506A1 (en) * 2009-11-10 2012-11-22 Selex Sistemi Integrati S.P.A. Avatar-Based Virtual Collaborative Assistance
US20120327183A1 (en) * 2011-06-23 2012-12-27 Hiromitsu Fujii Information processing apparatus, information processing method, program, and server
US20120327091A1 (en) * 2010-03-08 2012-12-27 Nokia Corporation Gestural Messages in Social Phonebook
US20130024774A1 (en) * 2011-07-18 2013-01-24 At&T Intellectual Property I, Lp Method and apparatus for multi-experience adaptation of media content
US20130155169A1 (en) * 2011-12-14 2013-06-20 Verizon Corporate Services Group Inc. Method and system for providing virtual conferencing
US20130217350A1 (en) * 2012-02-16 2013-08-22 Research In Motion Corporation System and method for communicating presence status
WO2013151759A1 (en) * 2012-04-06 2013-10-10 Liveone Group, Ltd. A social media application for a media content providing platform
US20140082107A1 (en) * 2012-09-14 2014-03-20 Salesforce.Com, Inc. Computer implemented methods and apparatus for managing objectives in an organization in a social network environment
US20140115092A1 (en) * 2012-10-19 2014-04-24 At&T Intellectual Property I, L.P. Sensory communication sessions over a network
US8712788B1 (en) * 2013-01-30 2014-04-29 Nadira S. Morales-Pavon Method of publicly displaying a person's relationship status
WO2014049603A3 (en) * 2012-08-28 2014-05-30 Tata Consultancy Services Limited Dynamic selection of reliability of publishing data
US20140157152A1 (en) * 2008-10-16 2014-06-05 At&T Intellectual Property I, Lp System and method for distributing an avatar
US20140162711A1 (en) * 2012-12-06 2014-06-12 At&T Intellectual Property I, L.P. Collecting And Analyzing Data In A Distributed Sensor Network
US8775653B2 (en) 2009-06-01 2014-07-08 Trion Worlds, Inc. Web client data conversion for synthetic environment interaction
EP2793432A1 (en) * 2012-01-09 2014-10-22 Huawei Technologies Co., Ltd Method for displaying user state, display terminal and server
US8881057B2 (en) 2010-11-09 2014-11-04 Blackberry Limited Methods and apparatus to display mobile device contexts
US20140344687A1 (en) * 2013-05-16 2014-11-20 Lenitra Durham Techniques for Natural User Interface Input based on Context
US20140351324A1 (en) * 2010-07-08 2014-11-27 Sony Corporation Information processing apparatus, information processing method, and program
US20150038123A1 (en) * 2013-07-30 2015-02-05 Here Global B.V. Mobile Driving Condition Detection
WO2012078983A3 (en) * 2010-12-10 2015-03-19 Blueforce Development Corporation Decision support
US20150091936A1 (en) * 2013-09-27 2015-04-02 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
US20150095310A1 (en) * 2013-09-27 2015-04-02 Here Global B.V. Method and apparatus for determining status updates associated with elements in a media item
US20150103019A1 (en) * 2011-04-22 2015-04-16 Joshua Michael Young Methods and Devices and Systems for Positioning Input Devices and Creating Control
US20150142887A1 (en) * 2012-05-21 2015-05-21 Zte Corporation Device, method and mobile terminal for updating mobile social network user state
US9060059B2 (en) 2010-09-10 2015-06-16 Google Inc. Call status sharing
US9084001B2 (en) 2011-07-18 2015-07-14 At&T Intellectual Property I, Lp Method and apparatus for multi-experience metadata translation of media content with metadata
CN104917882A (en) * 2015-04-30 2015-09-16 努比亚技术有限公司 Method and terminal for achieving cell electricity quantity display
US20150262132A1 (en) * 2014-03-13 2015-09-17 Microsoft Corporation User work schedule identification
US9146114B2 (en) 2012-06-22 2015-09-29 Google Inc. Presenting information for a current location or time
US20150279117A1 (en) * 2014-04-01 2015-10-01 Hallmark Cards, Incorporated Augmented Reality Appearance Enhancement
US9178983B2 (en) 2010-09-28 2015-11-03 E.Digital Corporation System and method for managing mobile communications
WO2015164951A1 (en) * 2014-05-01 2015-11-05 Abbas Mohamad Methods and systems relating to personalized evolving avatars
US20150319590A1 (en) * 2009-10-06 2015-11-05 Facebook, Inc. Sharing of location-based content item in social networking service
US9189076B2 (en) 2011-08-11 2015-11-17 At&T Intellectual Property I, Lp Method and apparatus for controlling multi-experience translation of media content
US20150358201A1 (en) * 2014-06-09 2015-12-10 Samsung Electronics Co., Ltd. Wearable electronic device, main electronic device, system and control method thereof
US20160006987A1 (en) * 2012-09-06 2016-01-07 Wenlong Li System and method for avatar creation and synchronization
US9237362B2 (en) 2011-08-11 2016-01-12 At&T Intellectual Property I, Lp Method and apparatus for multi-experience translation of media content with sensor sharing
US9294428B2 (en) 2012-01-18 2016-03-22 Kinectus, Llc Systems and methods for establishing communications between mobile device users
EP3015145A1 (en) * 2014-10-31 2016-05-04 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
US20160170572A1 (en) * 2011-06-13 2016-06-16 Sony Corporation Information processing device, information processing method, and computer program
US20160174027A1 (en) * 2013-03-15 2016-06-16 Athoc, Inc. Personnel Crisis Communications Management System
US20160188153A1 (en) * 2014-12-30 2016-06-30 PIQPIQ, Inc. Social messaging system for real-time selection and sorting of photo and video content
US20160266857A1 (en) * 2013-12-12 2016-09-15 Samsung Electronics Co., Ltd. Method and apparatus for displaying image information
US9460448B2 (en) 2010-03-20 2016-10-04 Nimbelink Corp. Environmental monitoring system which leverages a social networking service to deliver alerts to mobile phones or devices
US20160292903A1 (en) * 2014-09-24 2016-10-06 Intel Corporation Avatar audio communication systems and techniques
EP2974405A4 (en) * 2013-03-15 2016-11-16 Samsung Electronics Co Ltd Communication system with identification management and method of operation thereof
US9501782B2 (en) 2010-03-20 2016-11-22 Arthur Everett Felgate Monitoring system
US9503516B2 (en) 2014-08-06 2016-11-22 Google Technology Holdings LLC Context-based contact notification
US20160343063A1 (en) * 2015-05-18 2016-11-24 Ebay Inc. Replaced device handler
US9509787B2 (en) 2012-01-09 2016-11-29 Huawei Technologies Co., Ltd. User status displaying method, and server
CN106169227A (en) * 2016-08-31 2016-11-30 广东小天才科技有限公司 A kind of method and wearable device reminded of going on a journey
US9568993B2 (en) 2008-01-09 2017-02-14 International Business Machines Corporation Automated avatar mood effects in a virtual world
US20170091005A1 (en) * 2009-09-23 2017-03-30 Microsoft Technology Licensing, Llc Message communication of sensor and other data
US9691115B2 (en) 2012-06-21 2017-06-27 Cellepathy Inc. Context determination using access points in transportation and other scenarios
US9787623B2 (en) 2005-12-14 2017-10-10 Facebook, Inc. Automatically providing a communication based on location information for a user of a social networking system
US9800716B2 (en) 2010-09-21 2017-10-24 Cellepathy Inc. Restricting mobile device usage
US9820231B2 (en) 2013-06-14 2017-11-14 Microsoft Technology Licensing, Llc Coalescing geo-fence events
US9880604B2 (en) 2011-04-20 2018-01-30 Microsoft Technology Licensing, Llc Energy efficient location detection
US20180115746A1 (en) * 2015-11-17 2018-04-26 Tencent Technology (Shenzhen) Company Limited Video calling method and apparatus
WO2018081235A1 (en) * 2016-10-27 2018-05-03 Wal-Mart Stores, Inc. Systems and methods for adjusting the display quality of an avatar during an online or virtual shopping session
US20180144775A1 (en) * 2016-11-18 2018-05-24 Facebook, Inc. Methods and Systems for Tracking Media Effects in a Media Effect Index
US20180157377A1 (en) * 2015-05-12 2018-06-07 Samsung Electronics Co., Ltd. Electronic device and method for providing graphic user interface therefor
US9998866B2 (en) 2013-06-14 2018-06-12 Microsoft Technology Licensing, Llc Detecting geo-fence events using varying confidence levels
US10004056B2 (en) 2012-03-01 2018-06-19 Microsoft Technology Licensing, Llc Requesting a location of a user
US20180285645A1 (en) * 2017-01-12 2018-10-04 International Business Machines Corporation Setting a personal status using augmented reality
WO2018200042A1 (en) * 2017-04-27 2018-11-01 Snap Inc. Location-based virtual avatars
US10149116B1 (en) 2017-01-27 2018-12-04 Allstate Insurance Company Early notification of driving status to a mobile device
US10231668B2 (en) * 2015-11-13 2019-03-19 International Business Machines Corporation Instant messaging status reporting based on smart watch activity
US10304229B1 (en) * 2017-11-21 2019-05-28 International Business Machines Corporation Cognitive multi-layered real-time visualization of a user's sensed information
US10319376B2 (en) * 2009-09-17 2019-06-11 Avaya Inc. Geo-spatial event processing
US10348662B2 (en) * 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
US20190236622A1 (en) * 2018-01-30 2019-08-01 Robert Swanson Systems and methods for utilizing crowdsourcing to implement actions
US10438394B2 (en) * 2017-03-02 2019-10-08 Colopl, Inc. Information processing method, virtual space delivering system and apparatus therefor
US10440129B2 (en) * 2009-12-14 2019-10-08 At&T Intellectual Property I, L.P. Unified location and presence communication across real and virtual worlds
US10460085B2 (en) 2008-03-13 2019-10-29 Mattel, Inc. Tablet computer
US20190379760A1 (en) * 2017-01-25 2019-12-12 International Business Machines Corporation Targeted profile picture selection
CN110603539A (en) * 2017-02-07 2019-12-20 Iot控股公司 System and method for preventing monitoring and protecting privacy in virtual reality
TWI680400B (en) * 2014-10-31 2019-12-21 南韓商三星電子股份有限公司 Device and method of managing user information based on image
US10554908B2 (en) 2016-12-05 2020-02-04 Facebook, Inc. Media effect application
US10775232B2 (en) 2017-03-13 2020-09-15 Omron Corporation Environmental sensor
US10802683B1 (en) 2017-02-16 2020-10-13 Cisco Technology, Inc. Method, system and computer program product for changing avatars in a communication application display
US10824795B2 (en) * 2016-06-21 2020-11-03 Fernando J. Pinho Indoor positioning and recording system
US10843078B2 (en) 2010-06-07 2020-11-24 Affectiva, Inc. Affect usage within a gaming context
US10852918B1 (en) 2019-03-08 2020-12-01 Snap Inc. Contextual information in chat
US10861170B1 (en) 2018-11-30 2020-12-08 Snap Inc. Efficient human pose tracking in videos
US10867163B1 (en) 2016-11-29 2020-12-15 Facebook, Inc. Face detection for video calls
US10872451B2 (en) 2018-10-31 2020-12-22 Snap Inc. 3D avatar rendering
US10880246B2 (en) 2016-10-24 2020-12-29 Snap Inc. Generating and displaying customized avatars in electronic messages
US10893385B1 (en) 2019-06-07 2021-01-12 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US10896534B1 (en) 2018-09-19 2021-01-19 Snap Inc. Avatar style transformation using neural networks
US10895964B1 (en) 2018-09-25 2021-01-19 Snap Inc. Interface to display shared user groups
US10902659B2 (en) 2018-09-19 2021-01-26 International Business Machines Corporation Intelligent photograph overlay in an internet of things (IoT) computing environment
US10904181B2 (en) 2018-09-28 2021-01-26 Snap Inc. Generating customized graphics having reactions to electronic message content
US10902661B1 (en) 2018-11-28 2021-01-26 Snap Inc. Dynamic composite user identifier
US10911387B1 (en) 2019-08-12 2021-02-02 Snap Inc. Message reminder interface
US10936157B2 (en) 2017-11-29 2021-03-02 Snap Inc. Selectable item including a customized graphic for an electronic messaging application
US10936066B1 (en) 2019-02-13 2021-03-02 Snap Inc. Sleep detection in a location sharing system
US10939246B1 (en) 2019-01-16 2021-03-02 Snap Inc. Location-based context information sharing in a messaging system
US10949648B1 (en) 2018-01-23 2021-03-16 Snap Inc. Region-based stabilized face tracking
US10951562B2 (en) 2017-01-18 2021-03-16 Snap. Inc. Customized contextual media content item generation
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US10964082B2 (en) 2019-02-26 2021-03-30 Snap Inc. Avatar based on weather
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US10979752B1 (en) * 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
USD916872S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
US10984575B2 (en) 2019-02-06 2021-04-20 Snap Inc. Body pose estimation
USD916811S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916809S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916810S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
US10984569B2 (en) 2016-06-30 2021-04-20 Snap Inc. Avatar based ideogram generation
USD916871S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
US10991395B1 (en) 2014-02-05 2021-04-27 Snap Inc. Method for real time video processing involving changing a color of an object on a human face in a video
US10992619B2 (en) 2019-04-30 2021-04-27 Snap Inc. Messaging system with avatar generation
US11010022B2 (en) 2019-02-06 2021-05-18 Snap Inc. Global event-based avatar
US11032670B1 (en) 2019-01-14 2021-06-08 Snap Inc. Destination sharing in location sharing system
US11030813B2 (en) 2018-08-30 2021-06-08 Snap Inc. Video clip object tracking
US11030789B2 (en) 2017-10-30 2021-06-08 Snap Inc. Animated chat presence
US11036781B1 (en) 2020-01-30 2021-06-15 Snap Inc. Video generation system to render frames on demand using a fleet of servers
US11039270B2 (en) 2019-03-28 2021-06-15 Snap Inc. Points of interest in a location sharing system
US11036989B1 (en) 2019-12-11 2021-06-15 Snap Inc. Skeletal tracking using previous frames
US20210183397A1 (en) * 2018-04-20 2021-06-17 Facebook, Inc. Multiple Wake Words for Systems with Multiple Smart Assistants
US11048916B2 (en) 2016-03-31 2021-06-29 Snap Inc. Automated avatar generation
US20210200426A1 (en) * 2019-12-27 2021-07-01 Snap Inc. Expressive user icons in a map-based messaging system interface
US11055514B1 (en) 2018-12-14 2021-07-06 Snap Inc. Image face manipulation
US20210211487A1 (en) * 2020-01-08 2021-07-08 LINE Plus Corporation Method and system for sharing avatars through instant messaging application
US11063891B2 (en) 2019-12-03 2021-07-13 Snap Inc. Personalized avatar notification
US11069103B1 (en) 2017-04-20 2021-07-20 Snap Inc. Customized user interface for electronic communications
US11070661B2 (en) 2010-09-21 2021-07-20 Cellepathy Inc. Restricting mobile device usage
US11074675B2 (en) 2018-07-31 2021-07-27 Snap Inc. Eye texture inpainting
US11080917B2 (en) 2019-09-30 2021-08-03 Snap Inc. Dynamic parameterized user avatar stories
US20210243503A1 (en) * 2020-01-30 2021-08-05 Snap Inc. Selecting avatars to be included in the video being generated on demand
US11102020B2 (en) * 2017-12-27 2021-08-24 Sharp Kabushiki Kaisha Information processing device, information processing system, and information processing method
US11100311B2 (en) 2016-10-19 2021-08-24 Snap Inc. Neural networks for facial modeling
US11103795B1 (en) 2018-10-31 2021-08-31 Snap Inc. Game drawer
US11122094B2 (en) 2017-07-28 2021-09-14 Snap Inc. Software application manager for messaging applications
US11120597B2 (en) 2017-10-26 2021-09-14 Snap Inc. Joint audio-video facial animation system
US11120601B2 (en) 2018-02-28 2021-09-14 Snap Inc. Animated expressive icon
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11128586B2 (en) 2019-12-09 2021-09-21 Snap Inc. Context sensitive avatar captions
US11140515B1 (en) 2019-12-30 2021-10-05 Snap Inc. Interfaces for relative device positioning
US11158174B2 (en) 2019-07-12 2021-10-26 Carrier Corporation Security system with distributed audio and video sources
US11166123B1 (en) 2019-03-28 2021-11-02 Snap Inc. Grouped transmission of location data in a location sharing system
CN113632497A (en) * 2019-03-28 2021-11-09 Snap Inc. Generating personalized map interfaces with enhanced icons
US11169658B2 (en) 2019-12-31 2021-11-09 Snap Inc. Combined map icon with action indicator
US11176737B2 (en) 2018-11-27 2021-11-16 Snap Inc. Textured mesh building
US11178359B2 (en) * 2017-09-26 2021-11-16 Hewlett-Packard Development Company, L.P. Electronic device and generating conference call participants identifications
US11189098B2 (en) 2019-06-28 2021-11-30 Snap Inc. 3D object camera customization system
US11188190B2 (en) 2019-06-28 2021-11-30 Snap Inc. Generating animation overlays in a communication session
US11189102B2 (en) * 2017-12-22 2021-11-30 Samsung Electronics Co., Ltd. Electronic device for displaying object for augmented reality and operation method therefor
US11189070B2 (en) 2018-09-28 2021-11-30 Snap Inc. System and method of generating targeted user lists using customizable avatar characteristics
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11217020B2 (en) 2020-03-16 2022-01-04 Snap Inc. 3D cutout image modification
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11227442B1 (en) 2019-12-19 2022-01-18 Snap Inc. 3D captions with semantic graphical elements
US11229849B2 (en) 2012-05-08 2022-01-25 Snap Inc. System and method for generating and displaying avatars
US11245658B2 (en) 2018-09-28 2022-02-08 Snap Inc. System and method of generating private notifications between users in a communication session
US11263817B1 (en) 2019-12-19 2022-03-01 Snap Inc. 3D captions with face tracking
US11284144B2 (en) 2020-01-30 2022-03-22 Snap Inc. Video generation system to render frames on demand using a fleet of GPUs
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11310176B2 (en) 2018-04-13 2022-04-19 Snap Inc. Content suggestion system
US11307747B2 (en) 2019-07-11 2022-04-19 Snap Inc. Edge gesture interface with smart interactions
US11320969B2 (en) 2019-09-16 2022-05-03 Snap Inc. Messaging system with battery level sharing
US11356720B2 (en) 2020-01-30 2022-06-07 Snap Inc. Video generation system to render frames on demand
US11360733B2 (en) 2020-09-10 2022-06-14 Snap Inc. Colocated shared augmented reality without shared backend
US20220210107A1 (en) * 2020-12-31 2022-06-30 Snap Inc. Messaging user interface element with reminders
US11380215B2 (en) * 2018-08-30 2022-07-05 Kyndryl, Inc. Reward-based ecosystem for tracking nutritional consumption
US11411895B2 (en) 2017-11-29 2022-08-09 Snap Inc. Generating aggregated media content items for a group of users in an electronic messaging application
US11423490B2 (en) * 2014-06-27 2022-08-23 Intel Corporation Socially and contextually appropriate recommendation systems
US11425068B2 (en) 2009-02-03 2022-08-23 Snap Inc. Interactive avatar in messaging environment
US11425062B2 (en) 2019-09-27 2022-08-23 Snap Inc. Recommended content viewed by friends
US11438341B1 (en) 2016-10-10 2022-09-06 Snap Inc. Social media post subscribe requests for buffer user accounts
US11450051B2 (en) 2020-11-18 2022-09-20 Snap Inc. Personalized avatar real-time motion capture
US11452939B2 (en) 2020-09-21 2022-09-27 Snap Inc. Graphical marker generation system for synchronizing users
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11455081B2 (en) 2019-08-05 2022-09-27 Snap Inc. Message thread prioritization interface
US11460974B1 (en) 2017-11-28 2022-10-04 Snap Inc. Content discovery refresh
US11507733B2 (en) * 2011-08-18 2022-11-22 Pfaqutruma Research Llc System and methods of virtual world interaction
US11516173B1 (en) 2018-12-26 2022-11-29 Snap Inc. Message composition interface
US11520479B2 (en) 2016-09-01 2022-12-06 PIQPIQ, Inc. Mass media presentations with synchronized audio reactions
US11544883B1 (en) 2017-01-16 2023-01-03 Snap Inc. Coded vision system
US11544885B2 (en) 2021-03-19 2023-01-03 Snap Inc. Augmented reality experience based on physical items
US11543939B2 (en) 2020-06-08 2023-01-03 Snap Inc. Encoded image based messaging system
US11562548B2 (en) 2021-03-22 2023-01-24 Snap Inc. True size eyewear in real time
US11571623B2 (en) 2008-06-02 2023-02-07 Nike, Inc. System and method for creating an avatar
US11580700B2 (en) 2016-10-24 2023-02-14 Snap Inc. Augmented reality object manipulation
US11580682B1 (en) 2020-06-30 2023-02-14 Snap Inc. Messaging system with augmented reality makeup
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US11615592B2 (en) 2020-10-27 2023-03-28 Snap Inc. Side-by-side character animation from realtime 3D body motion capture
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11625873B2 (en) 2020-03-30 2023-04-11 Snap Inc. Personalized media overlay recommendation
US11636662B2 (en) 2021-09-30 2023-04-25 Snap Inc. Body normal network light and rendering control
US11636654B2 (en) 2021-05-19 2023-04-25 Snap Inc. AR-based connected portal shopping
US11651572B2 (en) 2021-10-11 2023-05-16 Snap Inc. Light and rendering of garments
US11651539B2 (en) 2020-01-30 2023-05-16 Snap Inc. System for generating media content items on demand
US11663792B2 (en) 2021-09-08 2023-05-30 Snap Inc. Body fitted accessory with physics simulation
US11662900B2 (en) 2016-05-31 2023-05-30 Snap Inc. Application control using a gesture based trigger
US11660022B2 (en) 2020-10-27 2023-05-30 Snap Inc. Adaptive skeletal joint smoothing
US11670059B2 (en) 2021-09-01 2023-06-06 Snap Inc. Controlling interactive fashion based on body gestures
US11676220B2 (en) 2018-04-20 2023-06-13 Meta Platforms, Inc. Processing multimodal user input for assistant systems
US11673054B2 (en) 2021-09-07 2023-06-13 Snap Inc. Controlling AR games on fashion items
US11676199B2 (en) 2019-06-28 2023-06-13 Snap Inc. Generating customizable avatar outfits
US11683280B2 (en) 2020-06-10 2023-06-20 Snap Inc. Messaging system including an external-resource dock and drawer
US11704878B2 (en) 2017-01-09 2023-07-18 Snap Inc. Surface aware lens
US11715042B1 (en) 2018-04-20 2023-08-01 Meta Platforms Technologies, Llc Interpretability of deep reinforcement learning models in assistant systems
US11734894B2 (en) 2020-11-18 2023-08-22 Snap Inc. Real-time motion transfer for prosthetic limbs
US11734959B2 (en) 2021-03-16 2023-08-22 Snap Inc. Activating hands-free mode on mirroring device
US11734866B2 (en) 2021-09-13 2023-08-22 Snap Inc. Controlling interactive fashion based on voice
US11748958B2 (en) 2021-12-07 2023-09-05 Snap Inc. Augmented reality unboxing experience
US11748931B2 (en) 2020-11-18 2023-09-05 Snap Inc. Body animation sharing and remixing
US11763481B2 (en) 2021-10-20 2023-09-19 Snap Inc. Mirror-based augmented reality experience
US11790614B2 (en) 2021-10-11 2023-10-17 Snap Inc. Inferring intent from pose and speech input
US11790531B2 (en) 2021-02-24 2023-10-17 Snap Inc. Whole body segmentation
US11798201B2 (en) 2021-03-16 2023-10-24 Snap Inc. Mirroring device with whole-body outfits
US11798238B2 (en) 2021-09-14 2023-10-24 Snap Inc. Blending body mesh into external mesh
US11809633B2 (en) 2021-03-16 2023-11-07 Snap Inc. Mirroring device with pointing based navigation
US11818286B2 (en) 2020-03-30 2023-11-14 Snap Inc. Avatar recommendation and reply
US11823346B2 (en) 2022-01-17 2023-11-21 Snap Inc. AR body part tracking system
US11830209B2 (en) 2017-05-26 2023-11-28 Snap Inc. Neural network-based image stream modification
US11836866B2 (en) 2021-09-20 2023-12-05 Snap Inc. Deforming real-world object using an external mesh
US11836862B2 (en) 2021-10-11 2023-12-05 Snap Inc. External mesh with vertex attributes
US11842729B1 (en) * 2019-05-08 2023-12-12 Apple Inc. Method and device for presenting a CGR environment based on audio data and lyric data
US11854069B2 (en) 2021-07-16 2023-12-26 Snap Inc. Personalized try-on ads
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11863513B2 (en) 2020-08-31 2024-01-02 Snap Inc. Media content playback and comments management
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11870745B1 (en) 2022-06-28 2024-01-09 Snap Inc. Media gallery sharing and management
US11875439B2 (en) 2018-04-18 2024-01-16 Snap Inc. Augmented expression system
US11880947B2 (en) 2021-12-21 2024-01-23 Snap Inc. Real-time upper-body garment exchange
US11888795B2 (en) 2020-09-21 2024-01-30 Snap Inc. Chats with micro sound clips
US11887260B2 (en) 2021-12-30 2024-01-30 Snap Inc. AR position indicator
US11886473B2 (en) 2018-04-20 2024-01-30 Meta Platforms, Inc. Intent identification for agent matching by assistant systems
US11893166B1 (en) 2022-11-08 2024-02-06 Snap Inc. User avatar movement control using an augmented reality eyewear device
US11900506B2 (en) 2021-09-09 2024-02-13 Snap Inc. Controlling interactive fashion based on facial expressions
US11908083B2 (en) 2021-08-31 2024-02-20 Snap Inc. Deforming custom mesh based on body mesh
US11908243B2 (en) 2021-03-16 2024-02-20 Snap Inc. Menu hierarchy navigation on electronic mirroring devices
US11910269B2 (en) 2020-09-25 2024-02-20 Snap Inc. Augmented reality content items including user avatar to share location
US11922010B2 (en) 2020-06-08 2024-03-05 Snap Inc. Providing contextual information with keyboard interface for messaging system
US11928783B2 (en) 2021-12-30 2024-03-12 Snap Inc. AR position and orientation along a plane
US11941227B2 (en) 2021-06-30 2024-03-26 Snap Inc. Hybrid search system for customizable media
US11956190B2 (en) 2020-09-11 2024-04-09 Snap Inc. Messaging system with a carousel of related entities

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101677718B1 (en) 2010-04-14 2016-12-06 Samsung Electronics Co., Ltd. Method and Apparatus for Processing Virtual World
US10572721B2 (en) 2010-08-09 2020-02-25 Nike, Inc. Monitoring fitness using a mobile device
US8422948B2 (en) 2010-09-17 2013-04-16 Research In Motion Limited Mobile wireless communications device including wireless-based availability detector and associated methods
EP2432201B1 (en) * 2010-09-17 2019-01-02 BlackBerry Limited Mobile wireless communications device including wireless-based availability detector and associated methods
US9299110B2 (en) 2011-10-19 2016-03-29 Facebook, Inc. Periodic ambient waveform analysis for dynamic device configuration
EP2690847A1 (en) * 2012-07-27 2014-01-29 Constantin Medien AG Virtual assistant for a telecommunication system
US8995911B2 (en) * 2012-12-20 2015-03-31 Nokia Technologies Oy Apparatus and associated methods
WO2015079865A1 (en) * 2013-11-27 2015-06-04 Sharp Corporation Input device, communication information identification method, processing device, display device, program, and recording medium
US9575534B2 (en) 2014-03-18 2017-02-21 Empire Technology Development Llc Device usage message generator indicative of power consumption of an electronic device
US10437448B2 (en) * 2014-07-08 2019-10-08 Honeywell International Inc. System and method for auto-configuration of devices in building information model
CN107005585B (en) * 2014-11-06 2020-10-02 IoT Holdings, Inc. Method and system for event pattern guided mobile content services
CN107430273B (en) * 2015-03-12 2022-11-15 Essilor International Method for customizing a mounted sensing device
US10659479B2 (en) * 2015-03-27 2020-05-19 Mcafee, Llc Determination of sensor usage
EP3295411A1 (en) * 2015-09-09 2018-03-21 Google LLC Systems and methods for providing content
JP5925373B1 (en) * 2015-10-09 2016-05-25 Lykaon Co., Ltd. Communication support system
CN107038201A (en) 2016-12-26 2017-08-11 阿里巴巴集团控股有限公司 Display methods, device, terminal and the server of personal homepage
JP6181330B1 (en) 2017-02-03 2017-08-16 DeNA Co., Ltd. System, method and program for managing avatars
CN107547739A (en) * 2017-08-28 2018-01-05 Guangdong Xiaotiancai Technology Co., Ltd. Method, apparatus, device, and storage medium for automatic shutdown of a mobile terminal
JP2021099538A (en) * 2018-03-30 2021-07-01 Sony Group Corporation Information processing apparatus, information processing method, and program
CN109167878A (en) * 2018-08-23 2019-01-08 Samsung Electronics (China) R&D Center Driving method, system, and apparatus for an avatar model
CN110123285A (en) * 2019-05-20 2019-08-16 Zhejiang Heye Health Technology Co., Ltd. Home health-care monitoring data visualization system
CN110123286A (en) * 2019-05-20 2019-08-16 Zhejiang Heye Health Technology Co., Ltd. Instantiation system for sleep monitoring data
JP7190052B2 (en) * 2019-08-20 2022-12-14 Japan Tobacco Inc. Communication support method, program, and communication server
KR102256383B1 (en) * 2020-05-18 2021-05-25 Kwon Young-hoon System for a chatbot service that provides response information and transforms the appearance of characters in consideration of the user's desired time
WO2023096455A1 (en) * 2021-11-29 2023-06-01 Korea Advanced Institute of Science and Technology Electronic device for managing work between members and operation method thereof
JP2023104179A (en) * 2022-01-17 2023-07-28 Anaguma Co., Ltd. Current status presentation system, current status presentation program, and current status presentation method
WO2024042687A1 (en) * 2022-08-25 2024-02-29 Sony Interactive Entertainment Inc. Information processing device, information processing system, method for controlling information processing device, and program

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000048224A (en) * 1998-07-31 2000-02-18 Sony Corp Device and method for information processing, and providing medium
JP2004185437A (en) * 2002-12-04 2004-07-02 Nippon Hoso Kyokai (NHK) Program, server, client, and method for chat reflecting body information
CN1757057A (en) * 2003-03-03 2006-04-05 America Online, Inc. Using avatars to communicate
JP2004287789A (en) * 2003-03-20 2004-10-14 Fuji Xerox Co Ltd Processing device for a document file to which access rights are set, image forming medium, processing method, and program therefor
JP4218830B2 (en) * 2003-11-18 2009-02-04 Sony Ericsson Mobile Communications Japan, Inc. Portable information device
JP2005151483A (en) * 2003-11-20 2005-06-09 Hitachi Ltd Communication terminal device
DE602004025897D1 (en) * 2003-12-09 2010-04-22 Method for triggering an alert with avatars in a cell phone
JP4227043B2 (en) * 2004-03-04 2009-02-18 Nomura Research Institute, Ltd. Avatar control system
KR100809585B1 (en) * 2004-12-21 2008-03-07 Samsung Electronics Co., Ltd. Device and method for processing schedule-related event in wireless terminal
JP4054806B2 (en) * 2005-01-12 2008-03-05 NTT Docomo, Inc. Presence server, mobile device, presence information management system, and presence information management method
JP2006350416A (en) * 2005-06-13 2006-12-28 Tecmo Ltd Information retrieval system using avatar
JP2007058379A (en) * 2005-08-23 2007-03-08 Hotpot Co., Ltd. Content providing system, server device, program, and recording medium
JP2007172372A (en) * 2005-12-22 2007-07-05 Sharp Corp Mediation device, communication device, mediation program, and communication program

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030104819A1 (en) * 2001-12-05 2003-06-05 Intel Corporation Automatically updating presence information
US20040002634A1 (en) * 2002-06-28 2004-01-01 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US6817979B2 (en) * 2002-06-28 2004-11-16 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US20040179038A1 (en) * 2003-03-03 2004-09-16 Blattner Patrick D. Reactive avatars
US20050044143A1 (en) * 2003-08-19 2005-02-24 Logitech Europe S.A. Instant messenger presence and identity management
US7752268B2 (en) * 2003-09-25 2010-07-06 Oracle America, Inc. Method and system for presence state assignment based on schedule information in an instant messaging system
US20050091269A1 (en) * 2003-10-24 2005-04-28 Gerber Robert H. System and method for preference application installation and execution
US20060098027A1 (en) * 2004-11-09 2006-05-11 Rice Myra L Method and apparatus for providing call-related personal images responsive to supplied mood data
US20070074114A1 (en) * 2005-09-29 2007-03-29 Conopco, Inc., D/B/A Unilever Automated dialogue interface
US20070073799A1 (en) * 2005-09-29 2007-03-29 Conopco, Inc., D/B/A Unilever Adaptive user profiling on mobile devices
US20070174389A1 (en) * 2006-01-10 2007-07-26 Aol Llc Indicating Recent Content Publication Activity By A User
US20080184170A1 (en) * 2007-01-16 2008-07-31 Shape Innovations Inc Systems and methods for customized instant messaging application for displaying status of measurements from sensors
US20080201638A1 (en) * 2007-02-15 2008-08-21 Yahoo! Inc. Context avatar
US20080240384A1 (en) * 2007-03-29 2008-10-02 Lalitha Suryanarayana Methods and apparatus to provide presence information
US20080269958A1 (en) * 2007-04-26 2008-10-30 Ford Global Technologies, Llc Emotive advisory system and method
US20090055484A1 (en) * 2007-08-20 2009-02-26 Thanh Vuong System and method for representation of electronic mail users using avatars

Cited By (535)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110087344A1 (en) * 1999-05-12 2011-04-14 Wilbert Quinc Murdock Smart golf software
US9787623B2 (en) 2005-12-14 2017-10-10 Facebook, Inc. Automatically providing a communication based on location information for a user of a social networking system
US10826858B2 (en) 2007-02-28 2020-11-03 Facebook, Inc. Automatically providing a communication based on location information for a user of a social networking system
US10225223B2 (en) 2007-02-28 2019-03-05 Facebook, Inc. Automatically providing a communication based on location information for a user of a social networking system
US8898325B2 (en) 2007-03-06 2014-11-25 Trion Worlds, Inc. Apparatus, method, and computer readable media to perform transactions in association with participants interacting in a synthetic environment
US20090275414A1 (en) * 2007-03-06 2009-11-05 Trion World Network, Inc. Apparatus, method, and computer readable media to perform transactions in association with participants interacting in a synthetic environment
US9495684B2 (en) 2007-12-13 2016-11-15 The Invention Science Fund I, Llc Methods and systems for indicating behavior in a population cohort
US20090157625A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for identifying an avatar-linked population cohort
US20090157751A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying an avatar
US9211077B2 (en) * 2007-12-13 2015-12-15 The Invention Science Fund I, Llc Methods and systems for specifying an avatar
US8615479B2 (en) 2007-12-13 2013-12-24 The Invention Science Fund I, Llc Methods and systems for indicating behavior in a population cohort
US20090157660A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems employing a cohort-linked avatar
US20090156907A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying an avatar
US20090157482A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for indicating behavior in a population cohort
US20090157481A1 (en) * 2007-12-13 2009-06-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for specifying a cohort-linked avatar attribute
US20090164549A1 (en) * 2007-12-20 2009-06-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Methods and systems for determining interest in a cohort-linked avatar
US9418368B2 (en) 2007-12-20 2016-08-16 Invention Science Fund I, Llc Methods and systems for determining interest in a cohort-linked avatar
US9775554B2 (en) 2007-12-31 2017-10-03 Invention Science Fund I, Llc Population cohort-linked avatar
US20090172540A1 (en) * 2007-12-31 2009-07-02 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Population cohort-linked avatar
US9568993B2 (en) 2008-01-09 2017-02-14 International Business Machines Corporation Automated avatar mood effects in a virtual world
US20100076870A1 (en) * 2008-03-13 2010-03-25 Fuhu, Inc. Widgetized avatar and a method and system of virtual commerce including same
US10460085B2 (en) 2008-03-13 2019-10-29 Mattel, Inc. Tablet computer
US20090254859A1 (en) * 2008-04-03 2009-10-08 Nokia Corporation Automated selection of avatar characteristics for groups
US8832552B2 (en) * 2008-04-03 2014-09-09 Nokia Corporation Automated selection of avatar characteristics for groups
US11571623B2 (en) 2008-06-02 2023-02-07 Nike, Inc. System and method for creating an avatar
US10022631B2 (en) * 2008-06-02 2018-07-17 Nike, Inc. System and method for creating an avatar
US10905959B2 (en) 2008-06-02 2021-02-02 Nike, Inc. System and method for creating an avatar
US10569177B2 (en) 2008-06-02 2020-02-25 Nike, Inc. System and method for creating an avatar
US11235246B2 (en) 2008-06-02 2022-02-01 Nike, Inc. System and method for creating an avatar
US11896906B2 (en) 2008-06-02 2024-02-13 Nike, Inc. System and method for creating an avatar
US20090300513A1 (en) * 2008-06-02 2009-12-03 Nike, Inc. System and Method for Creating an Avatar
US20090309711A1 (en) * 2008-06-16 2009-12-17 Abhishek Adappa Methods and systems for configuring mobile devices using sensors
US8040233B2 (en) * 2008-06-16 2011-10-18 Qualcomm Incorporated Methods and systems for configuring mobile devices using sensors
US20100050253A1 (en) * 2008-08-22 2010-02-25 International Business Machines Corporation System and method for real world biometric analytics through the use of a multimodal biometric analytic wallet
US8448230B2 (en) 2008-08-22 2013-05-21 International Business Machines Corporation System and method for real world biometric analytics through the use of a multimodal biometric analytic wallet
US11269979B2 (en) * 2008-08-22 2022-03-08 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US9147060B2 (en) 2008-08-22 2015-09-29 International Business Machines Corporation System and method for real world biometric analytics through the use of a multimodal biometric analytic wallet
US11080377B2 (en) * 2008-08-22 2021-08-03 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US20100046806A1 (en) * 2008-08-22 2010-02-25 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US10679749B2 (en) * 2008-08-22 2020-06-09 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US11170083B2 (en) * 2008-08-22 2021-11-09 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US10776468B2 (en) 2008-08-22 2020-09-15 Daedalus Blue Llc System and method for real world biometric analytics through the use of a multimodal biometric analytic wallet
US20100050088A1 (en) * 2008-08-22 2010-02-25 Neustaedter Carman G Configuring a virtual world user-interface
US10013541B2 (en) 2008-08-22 2018-07-03 International Business Machines Corporation System and method for real world biometric analytics through the use of a multimodal biometric analytic wallet
US9223469B2 (en) * 2008-08-22 2015-12-29 Intellectual Ventures Fund 83 Llc Configuring a virtual world user-interface
US20180082151A1 (en) * 2008-08-22 2018-03-22 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US20180096228A1 (en) * 2008-08-22 2018-04-05 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US20180096227A1 (en) * 2008-08-22 2018-04-05 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US9056250B2 (en) * 2008-09-09 2015-06-16 Skype Systems and methods for handling communication events in a computer gaming system
US20100062856A1 (en) * 2008-09-09 2010-03-11 Skype Limited User interface
US20100094936A1 (en) * 2008-10-15 2010-04-15 Nokia Corporation Dynamic Layering of an Object
US20140157152A1 (en) * 2008-10-16 2014-06-05 At&T Intellectual Property I, Lp System and method for distributing an avatar
US11112933B2 (en) 2008-10-16 2021-09-07 At&T Intellectual Property I, L.P. System and method for distributing an avatar
US10055085B2 (en) * 2008-10-16 2018-08-21 At&T Intellectual Property I, Lp System and method for distributing an avatar
US20100106782A1 (en) * 2008-10-28 2010-04-29 Trion World Network, Inc. Persistent synthetic environment message notification
US8626863B2 (en) * 2008-10-28 2014-01-07 Trion Worlds, Inc. Persistent synthetic environment message notification
US20100115422A1 (en) * 2008-11-05 2010-05-06 At&T Intellectual Property I, L.P. System and method for conducting a communication exchange
US8589803B2 (en) * 2008-11-05 2013-11-19 At&T Intellectual Property I, L.P. System and method for conducting a communication exchange
US20100131864A1 (en) * 2008-11-21 2010-05-27 Bokor Brian R Avatar profile creation and linking in a virtual world
US20110302513A1 (en) * 2008-11-24 2011-12-08 Fredrik Ademar Methods and apparatuses for flexible modification of user interfaces
US20100134484A1 (en) * 2008-12-01 2010-06-03 Microsoft Corporation Three dimensional journaling environment
US20100153868A1 (en) * 2008-12-12 2010-06-17 International Business Machines Corporation System and method to modify avatar characteristics based on inferred conditions
US9741147B2 (en) * 2008-12-12 2017-08-22 International Business Machines Corporation System and method to modify avatar characteristics based on inferred conditions
US20100153497A1 (en) * 2008-12-12 2010-06-17 Nortel Networks Limited Sharing expression information among conference participants
US10244012B2 (en) 2008-12-15 2019-03-26 International Business Machines Corporation System and method to visualize activities through the use of avatars
US9075901B2 (en) * 2008-12-15 2015-07-07 International Business Machines Corporation System and method to visualize activities through the use of avatars
US20100153869A1 (en) * 2008-12-15 2010-06-17 International Business Machines Corporation System and method to visualize activities through the use of avatars
US11425068B2 (en) 2009-02-03 2022-08-23 Snap Inc. Interactive avatar in messaging environment
US8661073B2 (en) 2009-03-06 2014-02-25 Trion Worlds, Inc. Synthetic environment character data sharing
US8694585B2 (en) 2009-03-06 2014-04-08 Trion Worlds, Inc. Cross-interface communication
US8657686B2 (en) 2009-03-06 2014-02-25 Trion Worlds, Inc. Synthetic environment character data sharing
US20100229107A1 (en) * 2009-03-06 2010-09-09 Trion World Networks, Inc. Cross-interface communication
US20100229106A1 (en) * 2009-03-06 2010-09-09 Trion World Network, Inc. Synthetic environment character data sharing
US20100227688A1 (en) * 2009-03-06 2010-09-09 Trion World Network, Inc. Synthetic environment character data sharing
US8745726B2 (en) * 2009-05-21 2014-06-03 International Business Machines Corporation Identity verification in virtual worlds using encoded data
US20100299747A1 (en) * 2009-05-21 2010-11-25 International Business Machines Corporation Identity verification in virtual worlds using encoded data
US9032509B2 (en) 2009-05-21 2015-05-12 International Business Machines Corporation Identity verification in virtual worlds using encoded data
US8775653B2 (en) 2009-06-01 2014-07-08 Trion Worlds, Inc. Web client data conversion for synthetic environment interaction
US20120270601A1 (en) * 2009-06-16 2012-10-25 Bran Ferren Intelligent graphics interface in a handheld wireless device
US9195816B2 (en) * 2009-06-16 2015-11-24 Intel Corporation Intelligent graphics interface in a handheld wireless device
US8700012B2 (en) * 2009-06-16 2014-04-15 Intel Corporation Handheld electronic device using status awareness
US20120276932A1 (en) * 2009-06-16 2012-11-01 Bran Ferren Handheld electronic device using status awareness
US20110022443A1 (en) * 2009-07-21 2011-01-27 Palo Alto Research Center Incorporated Employment inference from mobile device data
US8939840B2 (en) * 2009-07-29 2015-01-27 Disney Enterprises, Inc. System and method for playsets using tracked objects and corresponding virtual worlds
US20110028219A1 (en) * 2009-07-29 2011-02-03 Disney Enterprises, Inc. (Burbank, Ca) System and method for playsets using tracked objects and corresponding virtual worlds
US10319376B2 (en) * 2009-09-17 2019-06-11 Avaya Inc. Geo-spatial event processing
US20110066423A1 (en) * 2009-09-17 2011-03-17 Avaya Inc. Speech-Recognition System for Location-Aware Applications
US10503571B2 (en) * 2009-09-23 2019-12-10 Microsoft Technology Licensing, Llc Message communication of sensor and other data
US20170091005A1 (en) * 2009-09-23 2017-03-30 Microsoft Technology Licensing, Llc Message communication of sensor and other data
US20150319590A1 (en) * 2009-10-06 2015-11-05 Facebook, Inc. Sharing of location-based content item in social networking service
US10117044B2 (en) * 2009-10-06 2018-10-30 Facebook, Inc. Sharing of location-based content item in social networking service
US20110084800A1 (en) * 2009-10-14 2011-04-14 Lee-Chun Ko Access Authorization Method And Apparatus For A Wireless Sensor Network
US8461963B2 (en) * 2009-10-14 2013-06-11 Industrial Technology Research Institute Access authorization method and apparatus for a wireless sensor network
US20120293506A1 (en) * 2009-11-10 2012-11-22 Selex Sistemi Integrati S.P.A. Avatar-Based Virtual Collaborative Assistance
US10440129B2 (en) * 2009-12-14 2019-10-08 At&T Intellectual Property I, L.P. Unified location and presence communication across real and virtual worlds
US20120327091A1 (en) * 2010-03-08 2012-12-27 Nokia Corporation Gestural Messages in Social Phonebook
US9460448B2 (en) 2010-03-20 2016-10-04 Nimbelink Corp. Environmental monitoring system which leverages a social networking service to deliver alerts to mobile phones or devices
US9501782B2 (en) 2010-03-20 2016-11-22 Arthur Everett Felgate Monitoring system
US20110264728A1 (en) * 2010-04-23 2011-10-27 Research In Motion Limited Method and apparatus for electronically posting a graphic identifier to a plurality of servers
US20110298618A1 (en) * 2010-06-02 2011-12-08 Apple Inc. Remote User Status Indicators
US9800705B2 (en) * 2010-06-02 2017-10-24 Apple Inc. Remote user status indicators
US20110298808A1 (en) * 2010-06-02 2011-12-08 Toyota Motor Engineering & Manufacturing North America, Inc. Animated Vehicle Attendance Systems
US10843078B2 (en) 2010-06-07 2020-11-24 Affectiva, Inc. Affect usage within a gaming context
US9247903B2 (en) * 2010-06-07 2016-02-02 Affectiva, Inc. Using affect within a gaming context
US20120135804A1 (en) * 2010-06-07 2012-05-31 Daniel Bender Using affect within a gaming context
US9940468B2 (en) * 2010-07-08 2018-04-10 Sony Corporation Preserving user privacy
US20140351324A1 (en) * 2010-07-08 2014-11-27 Sony Corporation Information processing apparatus, information processing method, and program
US8564621B2 (en) * 2010-08-11 2013-10-22 International Business Machines Corporation Replicating changes between corresponding objects
EP2418123A1 (en) * 2010-08-11 2012-02-15 Valeo Schalter und Sensoren GmbH Method and system for supporting a driver of a vehicle in manoeuvring the vehicle on a driving route and portable communication device
US20120038667A1 (en) * 2010-08-11 2012-02-16 International Business Machines Corporation Replicating Changes Between Corresponding Objects
US9413883B2 (en) 2010-09-10 2016-08-09 Google Inc. Call status sharing
US9060059B2 (en) 2010-09-10 2015-06-16 Google Inc. Call status sharing
US10028113B2 (en) 2010-09-21 2018-07-17 Cellepathy Inc. Device control based on number of vehicle occupants
US8750853B2 (en) 2010-09-21 2014-06-10 Cellepathy Ltd. Sensor-based determination of user role, location, and/or state of one or more in-vehicle mobile devices and enforcement of usage thereof
WO2012040392A3 (en) * 2010-09-21 2012-05-31 Cellepathy Ltd. System and method for sensor-based determination of user role, location, and/or state of one of more in-vehicle mobile devices and enforcement of usage thereof
US8290480B2 (en) 2010-09-21 2012-10-16 Cellepathy Ltd. System and method for selectively restricting in-vehicle mobile device usage
US9800716B2 (en) 2010-09-21 2017-10-24 Cellepathy Inc. Restricting mobile device usage
US11070661B2 (en) 2010-09-21 2021-07-20 Cellepathy Inc. Restricting mobile device usage
US9078116B2 (en) 2010-09-21 2015-07-07 Cellepathy Ltd. In-vehicle device location determination and enforcement of usage thereof
US9622055B2 (en) 2010-09-28 2017-04-11 E.Digital Corporation System and method for managing mobile communications
US9641664B2 (en) 2010-09-28 2017-05-02 E.Digital Corporation System, apparatus, and method for utilizing sensor data
US9178983B2 (en) 2010-09-28 2015-11-03 E.Digital Corporation System and method for managing mobile communications
US8881057B2 (en) 2010-11-09 2014-11-04 Blackberry Limited Methods and apparatus to display mobile device contexts
US8229079B2 (en) 2010-12-08 2012-07-24 Google Inc. Propagating user status information across computing resources
WO2012078983A3 (en) * 2010-12-10 2015-03-19 Blueforce Development Corporation Decision support
US9066211B2 (en) 2010-12-10 2015-06-23 Blueforce Development Corporation Decision support
US9405714B2 (en) * 2011-03-22 2016-08-02 Samsung Electronics Co., Ltd. Method and apparatus for managing sensor data and method and apparatus for analyzing sensor data
US20120246261A1 (en) * 2011-03-22 2012-09-27 Roh Yohan J Method and apparatus for managing sensor data and method and apparatus for analyzing sensor data
US9880604B2 (en) 2011-04-20 2018-01-30 Microsoft Technology Licensing, Llc Energy efficient location detection
US20150103019A1 (en) * 2011-04-22 2015-04-16 Joshua Michael Young Methods and Devices and Systems for Positioning Input Devices and Creating Control
US20160170572A1 (en) * 2011-06-13 2016-06-16 Sony Corporation Information processing device, information processing method, and computer program
US20160283579A1 (en) * 2011-06-13 2016-09-29 Sony Corporation Information processing device, information processing method, and computer program
US10182209B2 (en) * 2011-06-23 2019-01-15 Sony Corporation Information processing apparatus, information processing method, program, and server
US10986312B2 (en) 2011-06-23 2021-04-20 Sony Corporation Information processing apparatus, information processing method, program, and server
US20150237307A1 (en) * 2011-06-23 2015-08-20 Sony Corporation Information processing apparatus, information processing method, program, and server
US8988490B2 (en) * 2011-06-23 2015-03-24 Sony Corporation Information processing apparatus, information processing method, program, and server
US20190098256A1 (en) * 2011-06-23 2019-03-28 Sony Corporation Information processing apparatus, information processing method, program, and server
US10158829B2 (en) * 2011-06-23 2018-12-18 Sony Corporation Information processing apparatus, information processing method, program, and server
US20120327183A1 (en) * 2011-06-23 2012-12-27 Hiromitsu Fujii Information processing apparatus, information processing method, program, and server
US9084001B2 (en) 2011-07-18 2015-07-14 At&T Intellectual Property I, Lp Method and apparatus for multi-experience metadata translation of media content with metadata
US11129259B2 (en) 2011-07-18 2021-09-21 At&T Intellectual Property I, L.P. Method and apparatus for multi-experience metadata translation of media content with metadata
US20130024774A1 (en) * 2011-07-18 2013-01-24 At&T Intellectual Property I, Lp Method and apparatus for multi-experience adaptation of media content
US9473547B2 (en) 2011-07-18 2016-10-18 At&T Intellectual Property I, L.P. Method and apparatus for multi-experience metadata translation of media content with metadata
US10491642B2 (en) 2011-07-18 2019-11-26 At&T Intellectual Property I, L.P. Method and apparatus for multi-experience metadata translation of media content with metadata
US10839596B2 (en) 2011-07-18 2020-11-17 At&T Intellectual Property I, L.P. Method and apparatus for multi-experience adaptation of media content
US9940748B2 (en) 2011-07-18 2018-04-10 At&T Intellectual Property I, L.P. Method and apparatus for multi-experience adaptation of media content
US8943396B2 (en) * 2011-07-18 2015-01-27 At&T Intellectual Property I, Lp Method and apparatus for multi-experience adaptation of media content
US9237362B2 (en) 2011-08-11 2016-01-12 At&T Intellectual Property I, Lp Method and apparatus for multi-experience translation of media content with sensor sharing
US9189076B2 (en) 2011-08-11 2015-11-17 At&T Intellectual Property I, Lp Method and apparatus for controlling multi-experience translation of media content
US9430048B2 (en) 2011-08-11 2016-08-30 At&T Intellectual Property I, L.P. Method and apparatus for controlling multi-experience translation of media content
US10812842B2 (en) 2011-08-11 2020-10-20 At&T Intellectual Property I, L.P. Method and apparatus for multi-experience translation of media content with sensor sharing
US9851807B2 (en) 2011-08-11 2017-12-26 At&T Intellectual Property I, L.P. Method and apparatus for controlling multi-experience translation of media content
US11507733B2 (en) * 2011-08-18 2022-11-22 Pfaqutruma Research Llc System and methods of virtual world interaction
US20130155169A1 (en) * 2011-12-14 2013-06-20 Verizon Corporate Services Group Inc. Method and system for providing virtual conferencing
US9007427B2 (en) * 2011-12-14 2015-04-14 Verizon Patent And Licensing Inc. Method and system for providing virtual conferencing
EP2793432A4 (en) * 2012-01-09 2014-12-03 Huawei Tech Co Ltd Method for displaying user state, display terminal and server
EP2793432A1 (en) * 2012-01-09 2014-10-22 Huawei Technologies Co., Ltd Method for displaying user state, display terminal and server
US9509787B2 (en) 2012-01-09 2016-11-29 Huawei Technologies Co., Ltd. User status displaying method, and server
US9763070B2 (en) 2012-01-18 2017-09-12 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US10516979B2 (en) 2012-01-18 2019-12-24 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US10390191B2 (en) 2012-01-18 2019-08-20 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US10575145B1 (en) 2012-01-18 2020-02-25 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US9294428B2 (en) 2012-01-18 2016-03-22 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US10117074B2 (en) 2012-01-18 2018-10-30 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US9584464B2 (en) 2012-01-18 2017-02-28 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US10117075B1 (en) 2012-01-18 2018-10-30 Kinectus, Llc Systems and methods for establishing communications between mobile device users
US9064243B2 (en) * 2012-02-16 2015-06-23 Blackberry Limited System and method for communicating presence status
US20130217350A1 (en) * 2012-02-16 2013-08-22 Research In Motion Corporation System and method for communicating presence status
US10004056B2 (en) 2012-03-01 2018-06-19 Microsoft Technology Licensing, Llc Requesting a location of a user
US10856251B2 (en) * 2012-03-01 2020-12-01 Microsoft Technology Licensing, Llc Requesting a location of a user
WO2013151759A1 (en) * 2012-04-06 2013-10-10 Liveone Group, Ltd. A social media application for a media content providing platform
US20150088622A1 (en) * 2012-04-06 2015-03-26 LiveOne, Inc. Social media application for a media content providing platform
US11229849B2 (en) 2012-05-08 2022-01-25 Snap Inc. System and method for generating and displaying avatars
US11607616B2 (en) 2012-05-08 2023-03-21 Snap Inc. System and method for generating and displaying avatars
US11925869B2 (en) 2012-05-08 2024-03-12 Snap Inc. System and method for generating and displaying avatars
US20150142887A1 (en) * 2012-05-21 2015-05-21 Zte Corporation Device, method and mobile terminal for updating mobile social network user state
EP2854431A4 (en) * 2012-05-21 2015-06-17 Zte Corp Device, method and mobile terminal for updating mobile social network user state
US9691115B2 (en) 2012-06-21 2017-06-27 Cellepathy Inc. Context determination using access points in transportation and other scenarios
US11765543B2 (en) 2012-06-22 2023-09-19 Google Llc Presenting information for a current location or time
US9146114B2 (en) 2012-06-22 2015-09-29 Google Inc. Presenting information for a current location or time
US9587947B2 (en) 2012-06-22 2017-03-07 Google Inc. Presenting information for a current location or time
US10996057B2 (en) 2012-06-22 2021-05-04 Google Llc Presenting information for a current location or time
US10168155B2 (en) 2012-06-22 2019-01-01 Google Llc Presenting information for a current location or time
US9641635B2 (en) 2012-08-28 2017-05-02 Tata Consultancy Services Limited Dynamic selection of reliability of publishing data
WO2014049603A3 (en) * 2012-08-28 2014-05-30 Tata Consultancy Services Limited Dynamic selection of reliability of publishing data
US20160006987A1 (en) * 2012-09-06 2016-01-07 Wenlong Li System and method for avatar creation and synchronization
US9936165B2 (en) * 2012-09-06 2018-04-03 Intel Corporation System and method for avatar creation and synchronization
US9774555B2 (en) * 2012-09-14 2017-09-26 Salesforce.Com, Inc. Computer implemented methods and apparatus for managing objectives in an organization in a social network environment
US20140082107A1 (en) * 2012-09-14 2014-03-20 Salesforce.Com, Inc. Computer implemented methods and apparatus for managing objectives in an organization in a social network environment
US20140115092A1 (en) * 2012-10-19 2014-04-24 At&T Intellectual Property I, L.P. Sensory communication sessions over a network
US10477261B2 (en) * 2012-10-19 2019-11-12 At&T Intellectual Property I, L.P. Sensory communication sessions over a network
US20140162711A1 (en) * 2012-12-06 2014-06-12 At&T Intellectual Property I, L.P. Collecting And Analyzing Data In A Distributed Sensor Network
US10660155B2 (en) * 2012-12-06 2020-05-19 At&T Intellectual Property I, L.P. Collecting and analyzing data in a distributed sensor network
US9288611B2 (en) * 2012-12-06 2016-03-15 At&T Intellectual Property I, L.P. Collecting and analyzing data in a distributed sensor network
US8712788B1 (en) * 2013-01-30 2014-04-29 Nadira S. Morales-Pavon Method of publicly displaying a person's relationship status
EP2974405A4 (en) * 2013-03-15 2016-11-16 Samsung Electronics Co Ltd Communication system with identification management and method of operation thereof
US20180270606A1 (en) * 2013-03-15 2018-09-20 Athoc, Inc. Personnel status tracking system in crisis management situations
US9986374B2 (en) * 2013-03-15 2018-05-29 Athoc, Inc. Personnel crisis communications management system
US10917775B2 (en) * 2013-03-15 2021-02-09 Athoc, Inc. Personnel status tracking system in crisis management situations
US20160174027A1 (en) * 2013-03-15 2016-06-16 Athoc, Inc. Personnel Crisis Communications Management System
US20140344687A1 (en) * 2013-05-16 2014-11-20 Lenitra Durham Techniques for Natural User Interface Input based on Context
US9820231B2 (en) 2013-06-14 2017-11-14 Microsoft Technology Licensing, Llc Coalescing geo-fence events
US9998866B2 (en) 2013-06-14 2018-06-12 Microsoft Technology Licensing, Llc Detecting geo-fence events using varying confidence levels
US9210547B2 (en) * 2013-07-30 2015-12-08 Here Global B.V. Mobile driving condition detection
US20150038123A1 (en) * 2013-07-30 2015-02-05 Here Global B.V. Mobile Driving Condition Detection
US20150095310A1 (en) * 2013-09-27 2015-04-02 Here Global B.V. Method and apparatus for determining status updates associated with elements in a media item
US9984076B2 (en) * 2013-09-27 2018-05-29 Here Global B.V. Method and apparatus for determining status updates associated with elements in a media item
US20150091936A1 (en) * 2013-09-27 2015-04-02 Lenovo (Beijing) Co., Ltd. Information processing method and electronic device
US20160266857A1 (en) * 2013-12-12 2016-09-15 Samsung Electronics Co., Ltd. Method and apparatus for displaying image information
US11651797B2 (en) 2014-02-05 2023-05-16 Snap Inc. Real time video processing for changing proportions of an object in the video
US11443772B2 (en) 2014-02-05 2022-09-13 Snap Inc. Method for triggering events in a video
US10991395B1 (en) 2014-02-05 2021-04-27 Snap Inc. Method for real time video processing involving changing a color of an object on a human face in a video
WO2015138364A1 (en) * 2014-03-13 2015-09-17 Microsoft Technology Licensing, Llc User work schedule identification
US20150262132A1 (en) * 2014-03-13 2015-09-17 Microsoft Corporation User work schedule identification
CN106663233A (en) * 2014-03-13 2017-05-10 Microsoft Technology Licensing, Llc User work schedule identification
US10586216B2 (en) * 2014-03-13 2020-03-10 Microsoft Technology Licensing, Llc User work schedule identification
US10768790B2 (en) 2014-04-01 2020-09-08 Hallmark Cards, Incorporated Augmented reality appearance enhancement
US20150279117A1 (en) * 2014-04-01 2015-10-01 Hallmark Cards, Incorporated Augmented Reality Appearance Enhancement
US9977572B2 (en) * 2014-04-01 2018-05-22 Hallmark Cards, Incorporated Augmented reality appearance enhancement
US11429250B2 (en) 2014-04-01 2022-08-30 Hallmark Cards, Incorporated Augmented reality appearance enhancement
US20180267677A1 (en) * 2014-04-01 2018-09-20 Hallmark Cards, Incorporated Augmented reality appearance enhancement
WO2015164951A1 (en) * 2014-05-01 2015-11-05 Abbas Mohamad Methods and systems relating to personalized evolving avatars
US20150358201A1 (en) * 2014-06-09 2015-12-10 Samsung Electronics Co., Ltd. Wearable electronic device, main electronic device, system and control method thereof
US11032137B2 (en) * 2014-06-09 2021-06-08 Samsung Electronics Co., Ltd. Wearable electronic device, main electronic device, system and control method thereof
US11637747B2 (en) 2014-06-09 2023-04-25 Samsung Electronics Co., Ltd. Wearable electronic device, main electronic device, system and control method thereof
US11423490B2 (en) * 2014-06-27 2022-08-23 Intel Corporation Socially and contextually appropriate recommendation systems
US9503516B2 (en) 2014-08-06 2016-11-22 Google Technology Holdings LLC Context-based contact notification
US20160292903A1 (en) * 2014-09-24 2016-10-06 Intel Corporation Avatar audio communication systems and techniques
CN105573573B (en) * 2014-10-31 2021-02-12 Samsung Electronics Co., Ltd. Apparatus and method for managing user information based on image
CN105573573A (en) * 2014-10-31 2016-05-11 三星电子株式会社 Device and method of managing user information based on image
US20190026933A1 (en) * 2014-10-31 2019-01-24 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
US20160125635A1 (en) * 2014-10-31 2016-05-05 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
TWI680400B (en) * 2014-10-31 2019-12-21 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
WO2016068581A1 (en) * 2014-10-31 2016-05-06 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
US11024070B2 (en) * 2014-10-31 2021-06-01 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
US10096143B2 (en) * 2014-10-31 2018-10-09 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
EP3015145A1 (en) * 2014-10-31 2016-05-04 Samsung Electronics Co., Ltd. Device and method of managing user information based on image
US10025491B2 (en) * 2014-12-30 2018-07-17 PIQPIQ, Inc. Social messaging system for real-time selection and sorting of photo and video content
US20160188153A1 (en) * 2014-12-30 2016-06-30 PIQPIQ, Inc. Social messaging system for real-time selection and sorting of photo and video content
CN104917882A (en) * 2015-04-30 2015-09-16 Nubia Technology Co., Ltd. Method and terminal for displaying battery level
US20180157377A1 (en) * 2015-05-12 2018-06-07 Samsung Electronics Co., Ltd. Electronic device and method for providing graphic user interface therefor
US20160343063A1 (en) * 2015-05-18 2016-11-24 Ebay Inc. Replaced device handler
US10902507B2 (en) * 2015-05-18 2021-01-26 Ebay Inc. Replaced device handler
US10231668B2 (en) * 2015-11-13 2019-03-19 International Business Machines Corporation Instant messaging status reporting based on smart watch activity
US10952673B2 (en) * 2015-11-13 2021-03-23 International Business Machines Corporation Instant messaging status reporting based on smart watch activity
US20180115746A1 (en) * 2015-11-17 2018-04-26 Tencent Technology (Shenzhen) Company Limited Video calling method and apparatus
US10218937B2 (en) * 2015-11-17 2019-02-26 Tencent Technology (Shenzhen) Company Limited Video calling method and apparatus
US11048916B2 (en) 2016-03-31 2021-06-29 Snap Inc. Automated avatar generation
US11631276B2 (en) 2016-03-31 2023-04-18 Snap Inc. Automated avatar generation
US11662900B2 (en) 2016-05-31 2023-05-30 Snap Inc. Application control using a gesture based trigger
US10824795B2 (en) * 2016-06-21 2020-11-03 Fernando J. Pinho Indoor positioning and recording system
US10984569B2 (en) 2016-06-30 2021-04-20 Snap Inc. Avatar based ideogram generation
US10848446B1 (en) 2016-07-19 2020-11-24 Snap Inc. Displaying customized electronic messaging graphics
US10855632B2 (en) 2016-07-19 2020-12-01 Snap Inc. Displaying customized electronic messaging graphics
US11438288B2 (en) 2016-07-19 2022-09-06 Snap Inc. Displaying customized electronic messaging graphics
US11509615B2 (en) 2016-07-19 2022-11-22 Snap Inc. Generating customized electronic messaging graphics
US10348662B2 (en) * 2016-07-19 2019-07-09 Snap Inc. Generating customized electronic messaging graphics
US11418470B2 (en) 2016-07-19 2022-08-16 Snap Inc. Displaying customized electronic messaging graphics
CN106169227A (en) * 2016-08-31 2016-11-30 Guangdong Xiaotiancai Technology Co., Ltd. Travel reminder method and wearable device
US11520479B2 (en) 2016-09-01 2022-12-06 PIQPIQ, Inc. Mass media presentations with synchronized audio reactions
US11438341B1 (en) 2016-10-10 2022-09-06 Snap Inc. Social media post subscribe requests for buffer user accounts
US11100311B2 (en) 2016-10-19 2021-08-24 Snap Inc. Neural networks for facial modeling
US11580700B2 (en) 2016-10-24 2023-02-14 Snap Inc. Augmented reality object manipulation
US11876762B1 (en) 2016-10-24 2024-01-16 Snap Inc. Generating and displaying customized avatars in media overlays
US11218433B2 (en) 2016-10-24 2022-01-04 Snap Inc. Generating and displaying customized avatars in electronic messages
US10880246B2 (en) 2016-10-24 2020-12-29 Snap Inc. Generating and displaying customized avatars in electronic messages
US11843456B2 (en) 2016-10-24 2023-12-12 Snap Inc. Generating and displaying customized avatars in media overlays
US10938758B2 (en) 2016-10-24 2021-03-02 Snap Inc. Generating and displaying customized avatars in media overlays
WO2018081235A1 (en) * 2016-10-27 2018-05-03 Wal-Mart Stores, Inc. Systems and methods for adjusting the display quality of an avatar during an online or virtual shopping session
GB2570076A (en) * 2016-10-27 2019-07-10 Walmart Apollo Llc Systems and methods for adjusting the display quality of an avatar during an online or virtual shopping session
GB2570076B (en) * 2016-10-27 2022-02-23 Walmart Apollo Llc Systems and methods for adjusting the display quality of an avatar during an online or virtual shopping session
US10950275B2 (en) * 2016-11-18 2021-03-16 Facebook, Inc. Methods and systems for tracking media effects in a media effect index
US20180144775A1 (en) * 2016-11-18 2018-05-24 Facebook, Inc. Methods and Systems for Tracking Media Effects in a Media Effect Index
US10867163B1 (en) 2016-11-29 2020-12-15 Facebook, Inc. Face detection for video calls
US10554908B2 (en) 2016-12-05 2020-02-04 Facebook, Inc. Media effect application
US11704878B2 (en) 2017-01-09 2023-07-18 Snap Inc. Surface aware lens
US11616745B2 (en) 2017-01-09 2023-03-28 Snap Inc. Contextual generation and selection of customized media content
US10423833B2 (en) * 2017-01-12 2019-09-24 International Business Machines Corporation Setting a personal status using augmented reality
US10152636B2 (en) 2017-01-12 2018-12-11 International Business Machines Corporation Setting a personal status using augmented reality
US20180285645A1 (en) * 2017-01-12 2018-10-04 International Business Machines Corporation Setting a personal status using augmented reality
US11544883B1 (en) 2017-01-16 2023-01-03 Snap Inc. Coded vision system
US10951562B2 (en) 2017-01-18 2021-03-16 Snap Inc. Customized contextual media content item generation
US11870743B1 (en) 2017-01-23 2024-01-09 Snap Inc. Customized digital avatar accessories
US10931783B2 (en) * 2017-01-25 2021-02-23 International Business Machines Corporation Targeted profile picture selection
US20190379760A1 (en) * 2017-01-25 2019-12-12 International Business Machines Corporation Targeted profile picture selection
US10149116B1 (en) 2017-01-27 2018-12-04 Allstate Insurance Company Early notification of driving status to a mobile device
US10880708B1 (en) 2017-01-27 2020-12-29 Allstate Insurance Company Early notification of driving status to a mobile device
US10560824B1 (en) 2017-01-27 2020-02-11 Allstate Insurance Company Early notification of driving status to a mobile device
CN110603539A (en) * 2017-02-07 2019-12-20 Iot控股公司 System and method for preventing monitoring and protecting privacy in virtual reality
US11443059B2 (en) * 2017-02-07 2022-09-13 Iot Holdings, Inc. System and method to prevent surveillance and preserve privacy in virtual reality
US10802683B1 (en) 2017-02-16 2020-10-13 Cisco Technology, Inc. Method, system and computer program product for changing avatars in a communication application display
US10438394B2 (en) * 2017-03-02 2019-10-08 Colopl, Inc. Information processing method, virtual space delivering system and apparatus therefor
US10775232B2 (en) 2017-03-13 2020-09-15 Omron Corporation Environmental sensor
US11069103B1 (en) 2017-04-20 2021-07-20 Snap Inc. Customized user interface for electronic communications
US11593980B2 (en) 2017-04-20 2023-02-28 Snap Inc. Customized user interface for electronic communications
US11418906B2 (en) 2017-04-27 2022-08-16 Snap Inc. Selective location-based identity communication
US11385763B2 (en) 2017-04-27 2022-07-12 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
US11842411B2 (en) * 2017-04-27 2023-12-12 Snap Inc. Location-based virtual avatars
US10963529B1 (en) 2017-04-27 2021-03-30 Snap Inc. Location-based search mechanism in a graphical user interface
US11893647B2 (en) * 2017-04-27 2024-02-06 Snap Inc. Location-based virtual avatars
US11474663B2 (en) 2017-04-27 2022-10-18 Snap Inc. Location-based search mechanism in a graphical user interface
US11782574B2 (en) 2017-04-27 2023-10-10 Snap Inc. Map-based graphical user interface indicating geospatial activity metrics
CN110799937A (en) * 2017-04-27 2020-02-14 斯纳普公司 Location-based virtual avatar
WO2018200042A1 (en) * 2017-04-27 2018-11-01 Snap Inc. Location-based virtual avatars
US11451956B1 (en) 2017-04-27 2022-09-20 Snap Inc. Location privacy management on map-based social media platforms
US11392264B1 (en) 2017-04-27 2022-07-19 Snap Inc. Map-based graphical user interface for multi-type social media galleries
US10952013B1 (en) 2017-04-27 2021-03-16 Snap Inc. Selective location-based identity communication
US11830209B2 (en) 2017-05-26 2023-11-28 Snap Inc. Neural network-based image stream modification
US11882162B2 (en) 2017-07-28 2024-01-23 Snap Inc. Software application manager for messaging applications
US11122094B2 (en) 2017-07-28 2021-09-14 Snap Inc. Software application manager for messaging applications
US11659014B2 (en) 2017-07-28 2023-05-23 Snap Inc. Software application manager for messaging applications
US11178359B2 (en) * 2017-09-26 2021-11-16 Hewlett-Packard Development Company, L.P. Electronic device and generating conference call participants identifications
US11610354B2 (en) 2017-10-26 2023-03-21 Snap Inc. Joint audio-video facial animation system
US11120597B2 (en) 2017-10-26 2021-09-14 Snap Inc. Joint audio-video facial animation system
US11706267B2 (en) 2017-10-30 2023-07-18 Snap Inc. Animated chat presence
US11930055B2 (en) 2017-10-30 2024-03-12 Snap Inc. Animated chat presence
US11030789B2 (en) 2017-10-30 2021-06-08 Snap Inc. Animated chat presence
US11354843B2 (en) 2017-10-30 2022-06-07 Snap Inc. Animated chat presence
US10304229B1 (en) * 2017-11-21 2019-05-28 International Business Machines Corporation Cognitive multi-layered real-time visualization of a user's sensed information
US20190197753A1 (en) * 2017-11-21 2019-06-27 International Business Machines Corporation Cognitive multi-layered real-time visualization of a user's sensed information
US10839579B2 (en) * 2017-11-21 2020-11-17 International Business Machines Corporation Cognitive multi-layered real-time visualization of a user's sensed information
US11460974B1 (en) 2017-11-28 2022-10-04 Snap Inc. Content discovery refresh
US11411895B2 (en) 2017-11-29 2022-08-09 Snap Inc. Generating aggregated media content items for a group of users in an electronic messaging application
US10936157B2 (en) 2017-11-29 2021-03-02 Snap Inc. Selectable item including a customized graphic for an electronic messaging application
US11189102B2 (en) * 2017-12-22 2021-11-30 Samsung Electronics Co., Ltd. Electronic device for displaying object for augmented reality and operation method therefor
US11102020B2 (en) * 2017-12-27 2021-08-24 Sharp Kabushiki Kaisha Information processing device, information processing system, and information processing method
US11769259B2 (en) 2018-01-23 2023-09-26 Snap Inc. Region-based stabilized face tracking
US10949648B1 (en) 2018-01-23 2021-03-16 Snap Inc. Region-based stabilized face tracking
US20190236622A1 (en) * 2018-01-30 2019-08-01 Robert Swanson Systems and methods for utilizing crowdsourcing to implement actions
US11688119B2 (en) 2018-02-28 2023-06-27 Snap Inc. Animated expressive icon
US11468618B2 (en) 2018-02-28 2022-10-11 Snap Inc. Animated expressive icon
US11880923B2 (en) 2018-02-28 2024-01-23 Snap Inc. Animated expressive icon
US10979752B1 (en) * 2018-02-28 2021-04-13 Snap Inc. Generating media content items based on location information
US11523159B2 (en) * 2018-02-28 2022-12-06 Snap Inc. Generating media content items based on location information
US11120601B2 (en) 2018-02-28 2021-09-14 Snap Inc. Animated expressive icon
US11310176B2 (en) 2018-04-13 2022-04-19 Snap Inc. Content suggestion system
US11875439B2 (en) 2018-04-18 2024-01-16 Snap Inc. Augmented expression system
US11721093B2 (en) 2018-04-20 2023-08-08 Meta Platforms, Inc. Content summarization for assistant systems
US20210224346A1 (en) 2018-04-20 2021-07-22 Facebook, Inc. Engaging Users by Personalized Composing-Content Recommendation
US11908179B2 (en) 2018-04-20 2024-02-20 Meta Platforms, Inc. Suggestions for fallback social contacts for assistant systems
US11727677B2 (en) 2018-04-20 2023-08-15 Meta Platforms Technologies, Llc Personalized gesture recognition for user interaction with assistant systems
US11869231B2 (en) 2018-04-20 2024-01-09 Meta Platforms Technologies, Llc Auto-completion for gesture-input in assistant systems
US11715042B1 (en) 2018-04-20 2023-08-01 Meta Platforms Technologies, Llc Interpretability of deep reinforcement learning models in assistant systems
US11908181B2 (en) 2018-04-20 2024-02-20 Meta Platforms, Inc. Generating multi-perspective responses by assistant systems
US11715289B2 (en) 2018-04-20 2023-08-01 Meta Platforms, Inc. Generating multi-perspective responses by assistant systems
US20210183397A1 (en) * 2018-04-20 2021-06-17 Facebook, Inc. Multiple Wake Words for Systems with Multiple Smart Assistants
US11688159B2 (en) 2018-04-20 2023-06-27 Meta Platforms, Inc. Engaging users by personalized composing-content recommendation
US11704899B2 (en) 2018-04-20 2023-07-18 Meta Platforms, Inc. Resolving entities from multiple data sources for assistant systems
US11887359B2 (en) 2018-04-20 2024-01-30 Meta Platforms, Inc. Content suggestions for content digests for assistant systems
US11886473B2 (en) 2018-04-20 2024-01-30 Meta Platforms, Inc. Intent identification for agent matching by assistant systems
US11694429B2 (en) 2018-04-20 2023-07-04 Meta Platforms Technologies, Llc Auto-completion for gesture-input in assistant systems
US20230186618A1 (en) 2018-04-20 2023-06-15 Meta Platforms, Inc. Generating Multi-Perspective Responses by Assistant Systems
US11676220B2 (en) 2018-04-20 2023-06-13 Meta Platforms, Inc. Processing multimodal user input for assistant systems
US11704900B2 (en) 2018-04-20 2023-07-18 Meta Platforms, Inc. Predictive injection of conversation fillers for assistant systems
US11074675B2 (en) 2018-07-31 2021-07-27 Snap Inc. Eye texture inpainting
US11715268B2 (en) 2018-08-30 2023-08-01 Snap Inc. Video clip object tracking
US11380215B2 (en) * 2018-08-30 2022-07-05 Kyndryl, Inc. Reward-based ecosystem for tracking nutritional consumption
US11030813B2 (en) 2018-08-30 2021-06-08 Snap Inc. Video clip object tracking
US11348301B2 (en) 2018-09-19 2022-05-31 Snap Inc. Avatar style transformation using neural networks
US10896534B1 (en) 2018-09-19 2021-01-19 Snap Inc. Avatar style transformation using neural networks
US10902659B2 (en) 2018-09-19 2021-01-26 International Business Machines Corporation Intelligent photograph overlay in an internet of things (IoT) computing environment
US11868590B2 (en) 2018-09-25 2024-01-09 Snap Inc. Interface to display shared user groups
US11294545B2 (en) 2018-09-25 2022-04-05 Snap Inc. Interface to display shared user groups
US10895964B1 (en) 2018-09-25 2021-01-19 Snap Inc. Interface to display shared user groups
US11477149B2 (en) 2018-09-28 2022-10-18 Snap Inc. Generating customized graphics having reactions to electronic message content
US11610357B2 (en) 2018-09-28 2023-03-21 Snap Inc. System and method of generating targeted user lists using customizable avatar characteristics
US11245658B2 (en) 2018-09-28 2022-02-08 Snap Inc. System and method of generating private notifications between users in a communication session
US11171902B2 (en) 2018-09-28 2021-11-09 Snap Inc. Generating customized graphics having reactions to electronic message content
US10904181B2 (en) 2018-09-28 2021-01-26 Snap Inc. Generating customized graphics having reactions to electronic message content
US11824822B2 (en) 2018-09-28 2023-11-21 Snap Inc. Generating customized graphics having reactions to electronic message content
US11455082B2 (en) 2018-09-28 2022-09-27 Snap Inc. Collaborative achievement interface
US11704005B2 (en) 2018-09-28 2023-07-18 Snap Inc. Collaborative achievement interface
US11189070B2 (en) 2018-09-28 2021-11-30 Snap Inc. System and method of generating targeted user lists using customizable avatar characteristics
US11103795B1 (en) 2018-10-31 2021-08-31 Snap Inc. Game drawer
US10872451B2 (en) 2018-10-31 2020-12-22 Snap Inc. 3D avatar rendering
US11321896B2 (en) 2018-10-31 2022-05-03 Snap Inc. 3D avatar rendering
US20220044479A1 (en) 2018-11-27 2022-02-10 Snap Inc. Textured mesh building
US11176737B2 (en) 2018-11-27 2021-11-16 Snap Inc. Textured mesh building
US11620791B2 (en) 2018-11-27 2023-04-04 Snap Inc. Rendering 3D captions within real-world environments
US11836859B2 (en) 2018-11-27 2023-12-05 Snap Inc. Textured mesh building
US10902661B1 (en) 2018-11-28 2021-01-26 Snap Inc. Dynamic composite user identifier
US11887237B2 (en) 2018-11-28 2024-01-30 Snap Inc. Dynamic composite user identifier
US11698722B2 (en) 2018-11-30 2023-07-11 Snap Inc. Generating customized avatars based on location information
US11199957B1 (en) 2018-11-30 2021-12-14 Snap Inc. Generating customized avatars based on location information
US11783494B2 (en) 2018-11-30 2023-10-10 Snap Inc. Efficient human pose tracking in videos
US11315259B2 (en) 2018-11-30 2022-04-26 Snap Inc. Efficient human pose tracking in videos
US10861170B1 (en) 2018-11-30 2020-12-08 Snap Inc. Efficient human pose tracking in videos
US11798261B2 (en) 2018-12-14 2023-10-24 Snap Inc. Image face manipulation
US11055514B1 (en) 2018-12-14 2021-07-06 Snap Inc. Image face manipulation
US11516173B1 (en) 2018-12-26 2022-11-29 Snap Inc. Message composition interface
US11877211B2 (en) 2019-01-14 2024-01-16 Snap Inc. Destination sharing in location sharing system
US11032670B1 (en) 2019-01-14 2021-06-08 Snap Inc. Destination sharing in location sharing system
US11751015B2 (en) 2019-01-16 2023-09-05 Snap Inc. Location-based context information sharing in a messaging system
US10939246B1 (en) 2019-01-16 2021-03-02 Snap Inc. Location-based context information sharing in a messaging system
US10945098B2 (en) 2019-01-16 2021-03-09 Snap Inc. Location-based context information sharing in a messaging system
US11294936B1 (en) 2019-01-30 2022-04-05 Snap Inc. Adaptive spatial density based clustering
US11693887B2 (en) 2019-01-30 2023-07-04 Snap Inc. Adaptive spatial density based clustering
US11557075B2 (en) 2019-02-06 2023-01-17 Snap Inc. Body pose estimation
US11714524B2 (en) 2019-02-06 2023-08-01 Snap Inc. Global event-based avatar
US10984575B2 (en) 2019-02-06 2021-04-20 Snap Inc. Body pose estimation
US11010022B2 (en) 2019-02-06 2021-05-18 Snap Inc. Global event-based avatar
US11809624B2 (en) 2019-02-13 2023-11-07 Snap Inc. Sleep detection in a location sharing system
US11275439B2 (en) 2019-02-13 2022-03-15 Snap Inc. Sleep detection in a location sharing system
US10936066B1 (en) 2019-02-13 2021-03-02 Snap Inc. Sleep detection in a location sharing system
US10964082B2 (en) 2019-02-26 2021-03-30 Snap Inc. Avatar based on weather
US11574431B2 (en) 2019-02-26 2023-02-07 Snap Inc. Avatar based on weather
US11301117B2 (en) 2019-03-08 2022-04-12 Snap Inc. Contextual information in chat
US10852918B1 (en) 2019-03-08 2020-12-01 Snap Inc. Contextual information in chat
US11868414B1 (en) 2019-03-14 2024-01-09 Snap Inc. Graph-based prediction for contact suggestion in a location sharing system
US11852554B1 (en) 2019-03-21 2023-12-26 Snap Inc. Barometer calibration in a location sharing system
US11039270B2 (en) 2019-03-28 2021-06-15 Snap Inc. Points of interest in a location sharing system
US20220317861A1 (en) * 2019-03-28 2022-10-06 Snap Inc. Generating personalized map interface with enhanced icons
CN113632497A (en) * 2019-03-28 2021-11-09 斯纳普公司 Generating personalized map interfaces with enhanced icons
US11740760B2 (en) * 2019-03-28 2023-08-29 Snap Inc. Generating personalized map interface with enhanced icons
US11166123B1 (en) 2019-03-28 2021-11-02 Snap Inc. Grouped transmission of location data in a location sharing system
US11249614B2 (en) * 2019-03-28 2022-02-15 Snap Inc. Generating personalized map interface with enhanced icons
US11638115B2 (en) 2019-03-28 2023-04-25 Snap Inc. Points of interest in a location sharing system
US10992619B2 (en) 2019-04-30 2021-04-27 Snap Inc. Messaging system with avatar generation
US11842729B1 (en) * 2019-05-08 2023-12-12 Apple Inc. Method and device for presenting a CGR environment based on audio data and lyric data
USD916872S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
USD916809S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916811S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
USD916810S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a graphical user interface
USD916871S1 (en) 2019-05-28 2021-04-20 Snap Inc. Display screen or portion thereof with a transitional graphical user interface
US11601783B2 (en) 2019-06-07 2023-03-07 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11917495B2 (en) 2019-06-07 2024-02-27 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US10893385B1 (en) 2019-06-07 2021-01-12 Snap Inc. Detection of a physical collision between two client devices in a location sharing system
US11189098B2 (en) 2019-06-28 2021-11-30 Snap Inc. 3D object camera customization system
US11823341B2 (en) 2019-06-28 2023-11-21 Snap Inc. 3D object camera customization system
US11443491B2 (en) 2019-06-28 2022-09-13 Snap Inc. 3D object camera customization system
US11188190B2 (en) 2019-06-28 2021-11-30 Snap Inc. Generating animation overlays in a communication session
US11676199B2 (en) 2019-06-28 2023-06-13 Snap Inc. Generating customizable avatar outfits
US11307747B2 (en) 2019-07-11 2022-04-19 Snap Inc. Edge gesture interface with smart interactions
US11714535B2 (en) 2019-07-11 2023-08-01 Snap Inc. Edge gesture interface with smart interactions
US11282352B2 (en) 2019-07-12 2022-03-22 Carrier Corporation Security system with distributed audio and video sources
US11158174B2 (en) 2019-07-12 2021-10-26 Carrier Corporation Security system with distributed audio and video sources
US11455081B2 (en) 2019-08-05 2022-09-27 Snap Inc. Message thread prioritization interface
US11588772B2 (en) 2019-08-12 2023-02-21 Snap Inc. Message reminder interface
US10911387B1 (en) 2019-08-12 2021-02-02 Snap Inc. Message reminder interface
US11822774B2 (en) 2019-09-16 2023-11-21 Snap Inc. Messaging system with battery level sharing
US11320969B2 (en) 2019-09-16 2022-05-03 Snap Inc. Messaging system with battery level sharing
US11662890B2 (en) 2019-09-16 2023-05-30 Snap Inc. Messaging system with battery level sharing
US11425062B2 (en) 2019-09-27 2022-08-23 Snap Inc. Recommended content viewed by friends
US11270491B2 (en) 2019-09-30 2022-03-08 Snap Inc. Dynamic parameterized user avatar stories
US11080917B2 (en) 2019-09-30 2021-08-03 Snap Inc. Dynamic parameterized user avatar stories
US11676320B2 (en) 2019-09-30 2023-06-13 Snap Inc. Dynamic media collection generation
US11218838B2 (en) 2019-10-31 2022-01-04 Snap Inc. Focused map-based context information surfacing
US11563702B2 (en) 2019-12-03 2023-01-24 Snap Inc. Personalized avatar notification
US11063891B2 (en) 2019-12-03 2021-07-13 Snap Inc. Personalized avatar notification
US11128586B2 (en) 2019-12-09 2021-09-21 Snap Inc. Context sensitive avatar captions
US11582176B2 (en) 2019-12-09 2023-02-14 Snap Inc. Context sensitive avatar captions
US11036989B1 (en) 2019-12-11 2021-06-15 Snap Inc. Skeletal tracking using previous frames
US11594025B2 (en) 2019-12-11 2023-02-28 Snap Inc. Skeletal tracking using previous frames
US11908093B2 (en) 2019-12-19 2024-02-20 Snap Inc. 3D captions with semantic graphical elements
US11227442B1 (en) 2019-12-19 2022-01-18 Snap Inc. 3D captions with semantic graphical elements
US11636657B2 (en) 2019-12-19 2023-04-25 Snap Inc. 3D captions with semantic graphical elements
US11263817B1 (en) 2019-12-19 2022-03-01 Snap Inc. 3D captions with face tracking
US11810220B2 (en) 2019-12-19 2023-11-07 Snap Inc. 3D captions with face tracking
US20210200426A1 (en) * 2019-12-27 2021-07-01 Snap Inc. Expressive user icons in a map-based messaging system interface
US11140515B1 (en) 2019-12-30 2021-10-05 Snap Inc. Interfaces for relative device positioning
US11128715B1 (en) 2019-12-30 2021-09-21 Snap Inc. Physical friend proximity in chat
US11893208B2 (en) 2019-12-31 2024-02-06 Snap Inc. Combined map icon with action indicator
US11169658B2 (en) 2019-12-31 2021-11-09 Snap Inc. Combined map icon with action indicator
US20210211487A1 (en) * 2020-01-08 2021-07-08 LINE Plus Corporation Method and system for sharing avatars through instant messaging application
US11507796B2 (en) * 2020-01-08 2022-11-22 LINE Plus Corporation Method and system for sharing avatars through instant messaging application
US11651022B2 (en) 2020-01-30 2023-05-16 Snap Inc. Video generation system to render frames on demand using a fleet of servers
US11356720B2 (en) 2020-01-30 2022-06-07 Snap Inc. Video generation system to render frames on demand
US20210243503A1 (en) * 2020-01-30 2021-08-05 Snap Inc. Selecting avatars to be included in the video being generated on demand
US11651539B2 (en) 2020-01-30 2023-05-16 Snap Inc. System for generating media content items on demand
US11729441B2 (en) 2020-01-30 2023-08-15 Snap Inc. Video generation system to render frames on demand
US11263254B2 (en) 2020-01-30 2022-03-01 Snap Inc. Video generation system to render frames on demand using a fleet of servers
US11284144B2 (en) 2020-01-30 2022-03-22 Snap Inc. Video generation system to render frames on demand using a fleet of GPUs
US11036781B1 (en) 2020-01-30 2021-06-15 Snap Inc. Video generation system to render frames on demand using a fleet of servers
US11831937B2 (en) 2020-01-30 2023-11-28 Snap Inc. Video generation system to render frames on demand using a fleet of GPUs
US11619501B2 (en) 2020-03-11 2023-04-04 Snap Inc. Avatar based on trip
US11217020B2 (en) 2020-03-16 2022-01-04 Snap Inc. 3D cutout image modification
US11775165B2 (en) 2020-03-16 2023-10-03 Snap Inc. 3D cutout image modification
US11625873B2 (en) 2020-03-30 2023-04-11 Snap Inc. Personalized media overlay recommendation
US11818286B2 (en) 2020-03-30 2023-11-14 Snap Inc. Avatar recommendation and reply
US11543939B2 (en) 2020-06-08 2023-01-03 Snap Inc. Encoded image based messaging system
US11822766B2 (en) 2020-06-08 2023-11-21 Snap Inc. Encoded image based messaging system
US11922010B2 (en) 2020-06-08 2024-03-05 Snap Inc. Providing contextual information with keyboard interface for messaging system
US11683280B2 (en) 2020-06-10 2023-06-20 Snap Inc. Messaging system including an external-resource dock and drawer
US11580682B1 (en) 2020-06-30 2023-02-14 Snap Inc. Messaging system with augmented reality makeup
US11863513B2 (en) 2020-08-31 2024-01-02 Snap Inc. Media content playback and comments management
US11893301B2 (en) 2020-09-10 2024-02-06 Snap Inc. Colocated shared augmented reality without shared backend
US11360733B2 (en) 2020-09-10 2022-06-14 Snap Inc. Colocated shared augmented reality without shared backend
US11956190B2 (en) 2020-09-11 2024-04-09 Snap Inc. Messaging system with a carousel of related entities
US11452939B2 (en) 2020-09-21 2022-09-27 Snap Inc. Graphical marker generation system for synchronizing users
US11833427B2 (en) 2020-09-21 2023-12-05 Snap Inc. Graphical marker generation system for synchronizing users
US11888795B2 (en) 2020-09-21 2024-01-30 Snap Inc. Chats with micro sound clips
US11910269B2 (en) 2020-09-25 2024-02-20 Snap Inc. Augmented reality content items including user avatar to share location
US11660022B2 (en) 2020-10-27 2023-05-30 Snap Inc. Adaptive skeletal joint smoothing
US11615592B2 (en) 2020-10-27 2023-03-28 Snap Inc. Side-by-side character animation from realtime 3D body motion capture
US11734894B2 (en) 2020-11-18 2023-08-22 Snap Inc. Real-time motion transfer for prosthetic limbs
US11748931B2 (en) 2020-11-18 2023-09-05 Snap Inc. Body animation sharing and remixing
US11450051B2 (en) 2020-11-18 2022-09-20 Snap Inc. Personalized avatar real-time motion capture
US11954723B2 (en) 2020-12-17 2024-04-09 Ebay Inc. Replaced device handler
US11924153B2 (en) * 2020-12-31 2024-03-05 Snap Inc. Messaging user interface element with reminders
US20220210107A1 (en) * 2020-12-31 2022-06-30 Snap Inc. Messaging user interface element with reminders
US11790531B2 (en) 2021-02-24 2023-10-17 Snap Inc. Whole body segmentation
US11908243B2 (en) 2021-03-16 2024-02-20 Snap Inc. Menu hierarchy navigation on electronic mirroring devices
US11798201B2 (en) 2021-03-16 2023-10-24 Snap Inc. Mirroring device with whole-body outfits
US11809633B2 (en) 2021-03-16 2023-11-07 Snap Inc. Mirroring device with pointing based navigation
US11734959B2 (en) 2021-03-16 2023-08-22 Snap Inc. Activating hands-free mode on mirroring device
US11544885B2 (en) 2021-03-19 2023-01-03 Snap Inc. Augmented reality experience based on physical items
US11562548B2 (en) 2021-03-22 2023-01-24 Snap Inc. True size eyewear in real time
US11941767B2 (en) 2021-05-19 2024-03-26 Snap Inc. AR-based connected portal shopping
US11636654B2 (en) 2021-05-19 2023-04-25 Snap Inc. AR-based connected portal shopping
US11941227B2 (en) 2021-06-30 2024-03-26 Snap Inc. Hybrid search system for customizable media
US11854069B2 (en) 2021-07-16 2023-12-26 Snap Inc. Personalized try-on ads
US11908083B2 (en) 2021-08-31 2024-02-20 Snap Inc. Deforming custom mesh based on body mesh
US11670059B2 (en) 2021-09-01 2023-06-06 Snap Inc. Controlling interactive fashion based on body gestures
US11673054B2 (en) 2021-09-07 2023-06-13 Snap Inc. Controlling AR games on fashion items
US11663792B2 (en) 2021-09-08 2023-05-30 Snap Inc. Body fitted accessory with physics simulation
US11900506B2 (en) 2021-09-09 2024-02-13 Snap Inc. Controlling interactive fashion based on facial expressions
US11734866B2 (en) 2021-09-13 2023-08-22 Snap Inc. Controlling interactive fashion based on voice
US11798238B2 (en) 2021-09-14 2023-10-24 Snap Inc. Blending body mesh into external mesh
US11836866B2 (en) 2021-09-20 2023-12-05 Snap Inc. Deforming real-world object using an external mesh
US11636662B2 (en) 2021-09-30 2023-04-25 Snap Inc. Body normal network light and rendering control
US11651572B2 (en) 2021-10-11 2023-05-16 Snap Inc. Light and rendering of garments
US11836862B2 (en) 2021-10-11 2023-12-05 Snap Inc. External mesh with vertex attributes
US11790614B2 (en) 2021-10-11 2023-10-17 Snap Inc. Inferring intent from pose and speech input
US11763481B2 (en) 2021-10-20 2023-09-19 Snap Inc. Mirror-based augmented reality experience
US11748958B2 (en) 2021-12-07 2023-09-05 Snap Inc. Augmented reality unboxing experience
US11960784B2 (en) 2021-12-07 2024-04-16 Snap Inc. Shared augmented reality unboxing experience
US11880947B2 (en) 2021-12-21 2024-01-23 Snap Inc. Real-time upper-body garment exchange
US11928783B2 (en) 2021-12-30 2024-03-12 Snap Inc. AR position and orientation along a plane
US11887260B2 (en) 2021-12-30 2024-01-30 Snap Inc. AR position indicator
US11823346B2 (en) 2022-01-17 2023-11-21 Snap Inc. AR body part tracking system
US11954762B2 (en) 2022-01-19 2024-04-09 Snap Inc. Object replacement system
US11870745B1 (en) 2022-06-28 2024-01-09 Snap Inc. Media gallery sharing and management
US11962598B2 (en) 2022-08-10 2024-04-16 Snap Inc. Social media post subscribe requests for buffer user accounts
US11956192B2 (en) 2022-10-12 2024-04-09 Snap Inc. Message reminder interface
US11893166B1 (en) 2022-11-08 2024-02-06 Snap Inc. User avatar movement control using an augmented reality eyewear device

Also Published As

Publication number Publication date
CN105554311A (en) 2016-05-04
KR20110014224A (en) 2011-02-10
EP2294802A1 (en) 2011-03-16
CN102037716A (en) 2011-04-27
WO2009146250A1 (en) 2009-12-03
JP2014059894A (en) 2014-04-03
JP2011523486A (en) 2011-08-11
JP5497015B2 (en) 2014-05-21

Similar Documents

Publication Publication Date Title
US20090300525A1 (en) Method and system for automatically updating avatar to indicate user's status
US10909639B2 (en) Acceleration of social interactions
AU2017240907B2 (en) Sharing updatable graphical user interface elements
US9686812B2 (en) System and method for wireless device pairing
US8892171B2 (en) System and method for user profiling from gathering user data through interaction with a wireless communication device
KR101774120B1 (en) Multi-activity platform and interface
US9002331B2 (en) System and method for managing mobile communications
US9564025B1 (en) Systems and methods for indicating a user state in a social network
RU2610944C2 (en) History log of users activities and associated emotional states
US20060052136A1 (en) Applications of broadband media and position sensing phones
US20080113675A1 (en) Applications of broadband media and position sensing phones
US20170080346A1 (en) Methods and systems relating to personalized evolving avatars
US20100022279A1 (en) Mood dependent alert signals in communication devices
WO2016077177A1 (en) Social cuing based on in-context observation
JP2002300296A (en) Portable communication terminal with affinity diagnostic function, control method for the same, program for the method and recording medium with the program recorded thereon
US20230041497A1 (en) Mood oriented workspace
CN115253272A (en) Game interaction method and device, storage medium and electronic equipment
CN115808886A (en) Equipment control method, device, equipment, system and storage medium
JP2017126346A (en) Information providing device, system, and program
JP2017126305A (en) Information providing device, system, and program
KR20130012208A (en) Method for providing a social network service based on phone numbers

Legal Events

Date Code Title Description
AS Assignment

Owner name: QUALCOMM INCORPORATED, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: ROMERA JOLLIFF, MARIA ELENA; HORODEZKY, SAMUEL JACOB; CHUNG, TIA; AND OTHERS; SIGNING DATES FROM 20080613 TO 20080623; REEL/FRAME: 021186/0582

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION