US20080242288A1 - Centralized voice recognition unit for wireless control of personal mobile electronic devices - Google Patents


Info

Publication number
US20080242288A1
Authority
US
United States
Prior art keywords
command
personal mobile
mobile electronic
user
voice recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/136,355
Inventor
Thomas M. Guyette
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nuance Communications Inc
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp
Priority to US12/136,355
Assigned to INTERNATIONAL BUSINESS MACHINES CORPORATION (Assignor: GUYETTE, THOMAS M)
Publication of US20080242288A1
Assigned to NUANCE COMMUNICATIONS, INC. (Assignor: INTERNATIONAL BUSINESS MACHINES CORPORATION)
Status: Abandoned


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M 1/72409: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M 1/72415: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/02: Constructional features of telephone sets
    • H04M 1/04: Supports for telephone transmitters or receivers
    • H04M 1/05: Supports for telephone transmitters or receivers specially adapted for use on head, throat or breast
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 2250/00: Details of telephonic subscriber devices
    • H04M 2250/74: Details of telephonic subscriber devices with voice recognition means

Definitions

  • A voice signal responsive to the voice command is generated (step 120) by the microphone 42 and provided to the voice recognition unit 46, which responds by generating (step 130) command data.
  • The command data is encoded and transmitted to the device (step 140) according to a wireless protocol such as the Bluetooth wireless protocol. Although all the devices 14 within the operating range of the wireless transmitter 50 receive the encoded command data, only the device 14 specified by its programmed name in the encoded command data responds to the command and activates (step 150) the desired device feature or operation.
  • A reply signal is generated (step 160) to provide feedback information, such as device status, to the user. The reply signal is received by the wireless receiver 66 and provided to a device status indicator 70. In one embodiment, the device status indicator 70 is a speaker. The device status indicator 70 and microphone 42 can be integrated as a single module or provided as components of a common article such as a headset or other form of headgear worn by the user. In another embodiment, the wireless receiver 66 is integrated into the wearable command module 54 and communications between the wireless receiver 66 and the device status indicator 70 are provided using the same communication link 58 used by the microphone 42 and voice recognition unit 46.
  • If the command executes successfully, the reply signal causes an audible confirmation, such as a spoken "OK" or a ding, to be generated (step 170) by the speaker 70. If an error occurs, the reply signal instead causes an audible error indication, such as a spoken "Error" or a buzz. If the command is not understood, the reply signal causes the speaker 70 to generate (step 170) an audible message to the user, such as a spoken "Please say that again" or a tone.
  • In other embodiments, the device status indicator 70 can present other forms of feedback information to the user. For example, the device status indicator 70 can include one or more optical sources (e.g., light emitting diodes (LEDs)) that emit light of a particular color according to the state of the reply signal. The optical source can be modulated to "blink" in various sequences to indicate status to the user. In another embodiment, the device status indicator 70 is a liquid crystal display (LCD) for presenting text or graphics to the user.
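The flow of method 100 (steps 110 through 170) can be summarized in a short procedural sketch. The recognizer and radio are stubbed out with lambdas, and all function names and reply values here are illustrative, not from the patent:

```python
def handle_voice_command(utterance, recognize, transmit):
    """One pass through method 100: generate command data from the spoken
    command (steps 120-130), broadcast it to the devices (step 140), and
    map the device's reply signal to user feedback (steps 160-170)."""
    command_data = recognize(utterance)   # steps 120-130: voice signal -> command data
    reply = transmit(command_data)        # steps 140-160: broadcast, collect reply signal
    if reply == "ok":
        return "OK"                       # audible confirmation (spoken "OK" or a ding)
    if reply == "error":
        return "Error"                    # audible error indication (spoken "Error" or a buzz)
    return "Please say that again"        # command not understood

# Stub recognizer and radio; a real system would drive actual hardware.
feedback = handle_voice_command(
    "Garage door: Open",
    recognize=lambda u: u,
    transmit=lambda data: "ok",
)
print(feedback)  # OK
```

The three return branches correspond to the three step-170 outcomes described above.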

Abstract

Described are a method and system for controlling the operation of personal mobile electronic devices disposed about a user. Each device is adapted to receive commands according to a common wireless protocol. A voice signal is generated in response to a voice command spoken by the user. Command data are generated for one of the devices in response to the voice signal. The command data are transmitted through a wireless link to the respective device to control an operation of the device. There is no need for each device to have independent voice recognition capability. Instead, the burden of voice recognition is managed by a wearable command module that communicates with all the controlled devices through wireless links.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a continuation application claiming the benefit of the filing date of co-pending U.S. patent application Ser. No. 11/225,616, filed Sep. 13, 2005, titled “Centralized Voice Recognition Unit for Wireless Control of Personal Mobile Electronic Devices,” the entirety of which U.S. patent application is incorporated by reference herein.
  • FIELD OF THE INVENTION
  • The invention relates generally to voice recognition for control and monitoring of personal mobile electronic devices. In particular, the invention relates to a system for controlling and monitoring the operation of multiple personal mobile electronic devices worn by a user using voice commands.
  • BACKGROUND OF THE INVENTION
  • The use of personal mobile electronic devices that can be carried on a person has increased dramatically in recent years. Such devices include cellular telephones, personal digital assistants (PDAs), handheld computers, broadcast systems, music and video playback devices, sophisticated digital watches and calculators. A personal mobile device can be transported by the user by attaching the device to clothing or articles worn by the user or carrying the device within the clothing. For example, some personal mobile devices can be carried in a shirt pocket, clipped to a belt, or otherwise attached to or included in apparel or other items worn by the user. Other personal mobile devices can be secured to the body using a wristband, armband, headset clip and the like.
  • User interaction with personal mobile devices can be improved by providing voice recognition capability. The user speaks a voice command which is recognized by the device and results in the activation of a particular device feature or operation. Devices that provide voice recognition capability are typically larger in size than similar devices without such capabilities due to the size of the embedded microphone and voice recognition unit. Consequently, it can be impractical for a user to carry multiple personal mobile devices each having its own voice recognition capability. Furthermore, the cost to the user for including a microphone and voice recognition unit in each device can be substantial.
  • The location of a personal mobile electronic device about the body can limit its ability to receive voice commands without distortion. For instance, a device not disposed in the front hemisphere about a user's head (i.e., near the mouth), such as a device secured to a belt or disposed in a pocket, may receive muffled or distorted speech. Moreover, as the distance between the device and the user's mouth increases, background noise can degrade the ability of the device to interpret voice commands spoken by the user. For example, the device can have difficulty differentiating between casual conversation in the same area as the user, and the user himself. If the device does not recognize a voice command, the device may not respond or an improper device feature or operation may be activated, causing delays, confusing operations, and causing user frustration.
  • What is needed is a method and system that allows a user to control personal mobile electronic devices disposed proximate to the body without being subject to the problems described above. The present invention satisfies this need and provides additional advantages.
  • SUMMARY OF THE INVENTION
  • In one aspect, the invention features a system for controlling the operation of a first personal mobile electronic device disposed proximate to a user and adapted to receive commands in accordance with a first command set and for controlling the operation of a second personal mobile electronic device disposed proximate to the user and adapted to receive commands in accordance with a second command set. The first and second personal mobile electronic devices are capable of operating independent of a control module. The system includes a microphone to generate a voice signal responsive to a voice command spoken by the user and a wearable command module. The wearable command module includes a voice recognition unit in communication with the microphone. The voice recognition unit generates command data from the first and second command sets in response to the voice signal. The wearable command module also includes a wireless data transmitter that is in electrical communication with the voice recognition unit. The wireless data transmitter transmits the command data from the first and second command sets encoded according to a wireless protocol to control an operation of the first and second personal mobile electronic devices, respectively.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and further advantages of this invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like numerals indicate like structural elements and features in the various figures. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
  • FIG. 1 illustrates a person wearing a variety of personal mobile devices.
  • FIG. 2 is a block diagram of an embodiment of a system for controlling the operation of personal mobile devices disposed proximate to a user in accordance with the invention.
  • FIG. 3 is a block diagram of another embodiment of a system for controlling the operation of personal mobile devices disposed proximate to a user in accordance with the invention.
  • FIG. 4 is a flowchart representation of an embodiment of a method for controlling the operation of personal mobile devices disposed proximate to a user in accordance with the invention.
  • DETAILED DESCRIPTION
  • In brief overview, the present invention relates to a system and method for controlling the operation of personal mobile electronic devices disposed proximate to a user. The system includes a microphone to generate a voice signal in response to a voice command spoken by a user. The system also includes a wearable command module that includes a voice recognition unit and a wireless data transmitter. The voice recognition unit generates command data for one of the devices in response to the voice signal. The transmitter sends the command data encoded according to a wireless protocol to control the operation of the personal mobile electronic device. Advantageously, there is no requirement for each device to have independent voice recognition capability. Instead, the burden of voice recognition is managed by the single wearable command module that communicates with all the controlled devices through a wireless link. Consequently, manufacturing of the devices is simplified and the size, weight and cost of the devices are decreased.
  • FIG. 1 shows an example of a user 10 carrying a variety of personal mobile electronic devices 14A to 14F (generally 14) about the body. The devices 14 can be, for example, cellular telephones and other mobile communication devices, personal digital assistants (PDAs), handheld computers and control devices for remote systems and equipment. One device 14A is shown protruding from a shirt pocket 18 and another device 14B is clipped to a belt 22. Other devices 14C and 14D are attached to a hat 26 and a shoe 30, respectively. Additional devices 14E (not directly visible but depicted by a dashed line) and 14F are disposed in a pants pocket 34 and attached to the user's wrist, respectively. Other arrangements are of course possible, and limited only by the imagination of the device designers.
  • If each device 14 individually provides voice command capability, the size and weight due to the embedded microphones and voice recognition units can limit the mobility of the user 10. Moreover, to operate one of the devices 14 using voice commands, the user 10 generally has to unfasten or otherwise retrieve the device 14 from its position about the body and to hold the device 14 near the user's mouth to provide for clear voice command reception.
  • In an embodiment of a system for controlling the operation of personal mobile devices disposed on a user according to the invention, a single microphone and a wearable command module are used to facilitate voice command operation of the personal mobile devices 14. The wearable command module includes a voice recognition unit to receive voice signals that are generated by the microphone in response to voice commands spoken by the user 10. The voice recognition unit generates command data for the device 14 specified in the voice command. A wireless data transmitter included in the wearable command module receives the command data from the voice recognition unit and encodes the command data according to a wireless protocol (e.g., Bluetooth™). The encoded command data is transmitted to the device 14 and causes the performance of the desired device function or operation. Advantageously, a single microphone which can be positioned in the front hemisphere about the user's head where speech is clearest receives the voice commands for multiple devices 14. In addition, the need to include a voice recognition unit in each device 14 is avoided, thus the devices 14 are generally smaller, lighter and less costly than would otherwise be possible.
  • Referring to the block diagram of FIG. 2, an embodiment of a system 38 for controlling the operation of personal mobile electronic devices 14 disposed on or proximate to a user includes a microphone 42, a voice recognition unit 46 and a wireless transmitter 50. The microphone 42 is configurable for positioning near the mouth of the user. For example, the microphone 42 can be provided on an adjustable boom of a headset worn by the user. As illustrated, the voice recognition unit 46 and wireless transmitter 50 are conveniently integrated as a single wearable command module 54 that can be worn by the user in a location separate from the microphone 42. For example, the wearable command module 54 can be placed in a shirt pocket or secured to a belt. In an alternative embodiment, the microphone 42 and wearable command module 54 are combined as a single module or article (e.g., a headset) that can be worn by the user in a position where clear speech can be received.
  • The microphone 42 and wearable command module 54 can be a portable stand-alone module or can be integrated into a personal mobile electronic device 14. By way of example, the wearable command module 54 can be integrated into a “master device” that may have a primary use but can also be used to coordinate communications with all the user devices 14. For example, a master device can be a cellular telephone, PDA, portable computer system (e.g., handheld computer, laptop computer) and the like.
  • In response to a received voice command, the microphone 42 generates a voice signal and transmits the signal to the voice recognition unit 46 over a communication link 58. If the microphone 42 is provided in a corded headset, the communication link 58 includes an electrically conductive path. Alternatively, the microphone 42 can be part of a wireless headset and the communication link 58 is a wireless link that utilizes any of a variety of wireless protocols.
  • The single microphone 42, centralized voice recognition unit 46 and wireless transmitter 50 enable the user to address all the personal mobile devices 14 worn on or disposed near to the user. Each voice command spoken by the user includes the programmed name of the device 14 to be controlled. The devices 14 are capable of receiving and interpreting data encoded according to the protocol of the wireless transmitter 50. Encoded data can include text, extensible markup language (XML) or other formatted commands generally including only small quantities of data. No additional external voice translation software or hardware is necessary.
  • The command set used by the system 38 is extensible. Any device 14 that communicates using the same protocol as the wireless transmitter 50 can be controlled. The user is not required to program the command set in advance or to perform mapping tasks to match voice commands to actions that can be performed by the controlled device, although such re-mapping functionality can be available and may be desirable to users. During a handshake phase, bidirectional communication occurs so that a command set understood by the device 14 is shared with the voice recognition unit 46. Subsequently, voice commands spoken by the user are translated into commands the device 14 understands.
  • As an example of the naming of devices 14, the wearable command module 54 can provide a “find devices” function to enable the user to browse and select controllable devices 14 for initialization. Upon selection, the wearable command module 54 communicates with the selected device using a data protocol that allows the device to indicate what functions it can perform, including a “name now” function. When the name now function is selected, the device replies with “speak my name now” to the wearable command module 54. The user then speaks the desired name for the device and the device replies to indicate it understands the spoken name. For example, the device can reply “I am now <NAME>.” Devices can maintain their own command sets, which include standard commands such as “set name”, “turn off”, “stop listening” and the like.
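The "find devices" and "name now" flow above can be sketched as a small registry. The `DeviceRegistry` class and its methods are assumptions for illustration; the patent specifies only the spoken dialog and the functions a device advertises.

```python
# Minimal sketch of the "find devices" / "name now" initialization flow:
# the module browses discovered devices, and a selected device becomes
# addressable by the name the user speaks for it.

class DeviceRegistry:
    def __init__(self):
        self._devices = {}                # programmed name -> device id

    def find_devices(self, discovered):
        # Browse controllable devices; each reports the functions it can
        # perform, and only those supporting naming are offered to the user.
        return {dev: funcs for dev, funcs in discovered.items()
                if "name now" in funcs}

    def name_now(self, device, spoken_name):
        # The device confirms the spoken name ("I am now <NAME>") and is
        # thereafter addressed by that name in voice commands.
        self._devices[spoken_name] = device
        return f"I am now {spoken_name}"

registry = DeviceRegistry()
nameable = registry.find_devices({"dev01": ["name now", "turn off"]})
reply = registry.name_now("dev01", "clock")
```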
  • FIG. 3 is a block diagram of another embodiment of a system 62 for controlling the operation of personal mobile electronic devices 14 disposed on or proximate to a user. The system 62 includes the components of the system 38 of FIG. 2 and additionally includes a wireless receiver 66 and a speaker 70. FIG. 4 is a flowchart depiction of an embodiment of a method 100 for controlling the operation of personal mobile electronic devices disposed on a user. Referring to FIG. 3 and FIG. 4, a user speaks a voice command (step 110) to cause a particular device 14 to execute a desired device feature or operation.
  • Voice commands in one example are structured as:

  • <Device Name>:<Command>,<Command Parameters>
  • where Device Name is the name assigned by the user to the device 14 to be controlled, Command is the device feature or operation to be executed and Command Parameters are parameters used to define the scope of the operation. For example, the user controls a public address (PA) system by saying “PA system: Tell the audience, license number 123 456 your lights are on”. Thus the phrase “License number 123 456 your lights are on” is broadcast from speakers in the PA system. In another example, the user says “Clock: Ring at, six-thirty AM Tuesday” to program a clock to sound an alarm. Some voice commands do not require any content for Command Parameters. For example, the user can control a remote control device for a garage door by saying either “Garage door: Open” or “Garage door: Close”.
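The command structure above splits cleanly on the first colon and first comma, with the parameter portion optional (as in the garage-door examples). A sketch of such a parser, under the assumption that this delimiting is how the structure would be decomposed:

```python
# Parse "<Device Name>:<Command>,<Command Parameters>" voice commands.
# The parameters portion is optional; commands like "Garage door: Open"
# carry no parameters at all.

def parse_voice_command(utterance):
    device_name, _, rest = utterance.partition(":")
    command, _, parameters = rest.partition(",")
    return (device_name.strip(),
            command.strip(),
            parameters.strip() or None)   # None when no parameters given

parse_voice_command("Clock: Ring at, six-thirty AM Tuesday")
# -> ("Clock", "Ring at", "six-thirty AM Tuesday")
parse_voice_command("Garage door: Open")
# -> ("Garage door", "Open", None)
```

Using `str.partition` rather than `str.split` keeps any further commas inside the parameters intact, which matters for free-text parameters such as the PA-system announcement.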
  • A voice signal responsive to the voice command is generated (step 120) by the microphone 42 and provided to the voice recognition unit 46 which responds by generating (step 130) command data. The command data is encoded and transmitted to the device (step 140) according to a wireless protocol such as the Bluetooth wireless protocol. Although all the devices 14 within the operating range of the wireless transmitter 50 receive the encoded command data, only the device 14 specified by its programmed name in the encoded command data responds to the command and activates (step 150) the desired device feature or operation.
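The broadcast-and-filter behavior in steps 140 and 150 can be sketched as follows. XML is one of the encodings the description mentions; the element and attribute names here are assumptions, as the patent does not define a schema.

```python
# Sketch of steps 140-150: command data is encoded (here as XML) and
# broadcast; every in-range device decodes it, but only the device whose
# programmed name matches responds.

import xml.etree.ElementTree as ET

def encode_command(device_name, command, parameters=None):
    root = ET.Element("command", {"device": device_name, "action": command})
    if parameters:
        root.text = parameters
    return ET.tostring(root, encoding="unicode")

def receive(encoded, my_name):
    # Each device inspects the broadcast and acts only if it is addressed.
    root = ET.fromstring(encoded)
    if root.get("device") != my_name:
        return None                       # not addressed to this device
    return (root.get("action"), root.text)

msg = encode_command("Garage door", "Open")
```

A garage-door device calling `receive(msg, "Garage door")` would act on the command, while a clock calling `receive(msg, "Clock")` would ignore it.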
  • After receipt of the command data, if the controlled device 14 has optional command confirmation capability, a reply signal is generated (step 160) to provide feedback information to the user such as device status. The reply signal is received by the wireless receiver 66 and provided to a device status indicator 70. In the illustrated embodiment the device status indicator 70 is a speaker. Preferably the device status indicator 70 and microphone 42 are integrated as a single module or provided as components of a common article such as a headset or other form of headgear worn by the user. In one embodiment, the wireless receiver 66 is integrated into the wearable command module 54 and communications between the wireless receiver 66 and the device status indicator 70 are provided using the same communications link 58 used by the microphone 42 and voice recognition unit 46.
  • If the command is understood and properly executed by the device 14, the reply signal causes an audible confirmation such as a spoken “OK” or a ding to be generated (step 170) by the speaker 70. Conversely, if the device 14 fails to execute the desired device feature or operation, the reply signal causes an audible error indication such as a spoken “Error” or a buzz to be generated (step 170) by the speaker 70. If the device 14 cannot interpret the command data, the reply signal causes the speaker 70 to generate (step 170) an audible message to the user such as a spoken “Please say that again” or a tone to indicate that the command is not understood.
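The three feedback outcomes in step 170 amount to a mapping from reply status to audible indication. A minimal sketch, in which the status codes (`"ok"`, `"error"`, `"unknown"`) are hypothetical labels; the patent names only the audible outputs:

```python
# Sketch of step 170: the reply signal selects a confirmation, an error
# indication, or a "say again" prompt for the speaker to render.

FEEDBACK = {
    "ok": "OK",                          # command understood and executed
    "error": "Error",                    # command understood but failed
    "unknown": "Please say that again",  # command data not interpretable
}

def audible_feedback(reply_status):
    # Default to the re-prompt when the reply itself cannot be classified.
    return FEEDBACK.get(reply_status, "Please say that again")
```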
  • In other system embodiments, the device status indicator 70 can present other forms of feedback information to the user. For example, the device status indicator 70 can include one or more optical sources (e.g., light emitting diodes (LEDs)) that emit light of a particular color according to the state of the reply signal. Alternatively, the optical source can be modulated to “blink” according to various sequences to provide the status to the user. In another example, the device status indicator 70 is a liquid crystal display (LCD) for presenting text or graphics to the user.
  • While the invention has been shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention.

Claims (13)

1. A system for controlling the operation of a first personal mobile electronic device disposed proximate to a user and adapted to receive commands in accordance with a first command set and for controlling the operation of a second personal mobile electronic device disposed proximate to the user and adapted to receive commands in accordance with a second command set, the first and second personal mobile electronic devices being capable of operating independent of a control module, the system comprising:
a microphone to generate a voice signal responsive to a voice command spoken by the user; and
a wearable command module comprising:
a voice recognition unit in communication with the microphone, the voice recognition unit generating command data from the first and second command sets in response to the voice signal; and
a wireless data transmitter in electrical communication with the voice recognition unit and transmitting the command data from the first and second command sets encoded according to a wireless protocol to control an operation of the first and second personal mobile electronic devices, respectively.
2. The system of claim 1 wherein the voice command comprises a device name assigned to one of the personal mobile electronic devices.
3. The system of claim 1 wherein the wearable command module is integrated in one of the personal mobile electronic devices.
4. The system of claim 1 wherein the microphone is in communication with the voice recognition unit through a wireless link.
5. The system of claim 1 wherein the microphone is in communication with the voice recognition unit through an electrically conductive path.
6. The system of claim 1 wherein the microphone is integral with the wearable command module.
7. The system of claim 1 further comprising a device status indicator to indicate to the user a status of one of the personal mobile electronic devices.
8. The system of claim 7 wherein the device status indicator comprises a speaker.
9. The system of claim 7 wherein the device status indicator comprises an optical source.
10. The system of claim 7 wherein the device status indicator comprises an optical display.
11. The system of claim 1 wherein each of the first and second personal mobile electronic devices is addressable by a name and the encoded command data comprises the programmed name of the personal mobile device being controlled.
12. The system of claim 1 further comprising a personal mobile electronic device.
13. The system of claim 1 further comprising a wireless data receiver in electrical communication with the voice recognition unit and configured to receive status and feedback data encoded in accordance with the wireless protocol.
US12/136,355 2005-09-13 2008-06-10 Centralized voice recognition unit for wireless control of personal mobile electronic devices Abandoned US20080242288A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/136,355 US20080242288A1 (en) 2005-09-13 2008-06-10 Centralized voice recognition unit for wireless control of personal mobile electronic devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/225,616 US7418281B2 (en) 2005-09-13 2005-09-13 Centralized voice recognition unit for wireless control of personal mobile electronic devices
US12/136,355 US20080242288A1 (en) 2005-09-13 2008-06-10 Centralized voice recognition unit for wireless control of personal mobile electronic devices

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/225,616 Continuation US7418281B2 (en) 2005-09-13 2005-09-13 Centralized voice recognition unit for wireless control of personal mobile electronic devices

Publications (1)

Publication Number Publication Date
US20080242288A1 true US20080242288A1 (en) 2008-10-02

Family

ID=37855847

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/225,616 Active 2026-08-12 US7418281B2 (en) 2005-09-13 2005-09-13 Centralized voice recognition unit for wireless control of personal mobile electronic devices
US12/136,355 Abandoned US20080242288A1 (en) 2005-09-13 2008-06-10 Centralized voice recognition unit for wireless control of personal mobile electronic devices

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/225,616 Active 2026-08-12 US7418281B2 (en) 2005-09-13 2005-09-13 Centralized voice recognition unit for wireless control of personal mobile electronic devices

Country Status (1)

Country Link
US (2) US7418281B2 (en)


Families Citing this family (70)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8677377B2 (en) 2005-09-08 2014-03-18 Apple Inc. Method and apparatus for building an intelligent automated assistant
US7418281B2 (en) * 2005-09-13 2008-08-26 International Business Machines Corporation Centralized voice recognition unit for wireless control of personal mobile electronic devices
US7769412B1 (en) * 2006-04-19 2010-08-03 Sprint Communications Company L.P. Wearable wireless telecommunications systems
US8626586B1 (en) 2006-06-23 2014-01-07 Sprint Communications Company L.P. Coordinated advertising for multiple wearable advertising display systems
US7715873B1 (en) 2006-06-23 2010-05-11 Sprint Communications Company L.P. Wearable accessories providing visual indicia of incoming events for wireless telecommunications device
US20080037727A1 (en) * 2006-07-13 2008-02-14 Clas Sivertsen Audio appliance with speech recognition, voice command control, and speech generation
US9318108B2 (en) 2010-01-18 2016-04-19 Apple Inc. Intelligent automated assistant
US7747446B2 (en) * 2006-12-12 2010-06-29 Nuance Communications, Inc. Voice recognition interactive system with a confirmation capability
US8996376B2 (en) 2008-04-05 2015-03-31 Apple Inc. Intelligent text-to-speech conversion
US10241752B2 (en) 2011-09-30 2019-03-26 Apple Inc. Interface for a virtual digital assistant
US10241644B2 (en) 2011-06-03 2019-03-26 Apple Inc. Actionable reminder entries
US8682667B2 (en) 2010-02-25 2014-03-25 Apple Inc. User profiling for selecting user specific voice input processing information
US9721563B2 (en) 2012-06-08 2017-08-01 Apple Inc. Name recognition system
US9794701B2 (en) 2012-08-31 2017-10-17 Starkey Laboratories, Inc. Gateway for a wireless hearing assistance device
US9547647B2 (en) 2012-09-19 2017-01-17 Apple Inc. Voice-based media searching
WO2014197334A2 (en) 2013-06-07 2014-12-11 Apple Inc. System and method for user-specified pronunciation of words for speech synthesis and recognition
JP6329833B2 (en) * 2013-10-04 2018-05-23 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Wearable terminal and method for controlling wearable terminal
US20150100313A1 (en) * 2013-10-09 2015-04-09 Verizon Patent And Licensing, Inc. Personification of computing devices for remote access
CN103558916A (en) * 2013-11-07 2014-02-05 百度在线网络技术(北京)有限公司 Man-machine interaction system, method and device
US11138971B2 (en) 2013-12-05 2021-10-05 Lenovo (Singapore) Pte. Ltd. Using context to interpret natural language speech recognition commands
CN106031255B (en) * 2014-02-21 2020-10-09 索尼公司 Communication control device, communication control method, and program
KR101573138B1 (en) * 2014-04-04 2015-12-01 삼성전자주식회사 Method and apparatus for measuring user physical activity
US10276154B2 (en) * 2014-04-23 2019-04-30 Lenovo (Singapore) Pte. Ltd. Processing natural language user inputs using context data
US9633004B2 (en) 2014-05-30 2017-04-25 Apple Inc. Better resolution when referencing to concepts
US9338493B2 (en) 2014-06-30 2016-05-10 Apple Inc. Intelligent automated assistant for TV user interactions
WO2016018057A1 (en) 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Method and device for providing function of mobile terminal
US9668121B2 (en) 2014-09-30 2017-05-30 Apple Inc. Social reminders
US10567477B2 (en) 2015-03-08 2020-02-18 Apple Inc. Virtual assistant continuity
US9578173B2 (en) 2015-06-05 2017-02-21 Apple Inc. Virtual assistant aided communication with 3rd party service in a communication session
US11025565B2 (en) 2015-06-07 2021-06-01 Apple Inc. Personalized prediction of responses for instant messaging
US20190007540A1 (en) * 2015-08-14 2019-01-03 Honeywell International Inc. Communication headset comprising wireless communication with personal protection equipment devices
US10747498B2 (en) 2015-09-08 2020-08-18 Apple Inc. Zero latency digital assistant
US10671428B2 (en) 2015-09-08 2020-06-02 Apple Inc. Distributed personal assistant
US9940928B2 (en) 2015-09-24 2018-04-10 Starkey Laboratories, Inc. Method and apparatus for using hearing assistance device as voice controller
US10366158B2 (en) 2015-09-29 2019-07-30 Apple Inc. Efficient word encoding for recurrent neural network language models
US11010550B2 (en) 2015-09-29 2021-05-18 Apple Inc. Unified language modeling framework for word prediction, auto-completion and auto-correction
US11587559B2 (en) * 2015-09-30 2023-02-21 Apple Inc. Intelligent device identification
CN106647292A (en) * 2015-10-30 2017-05-10 霍尼韦尔国际公司 Wearable gesture control device and method for intelligent household system
US10691473B2 (en) 2015-11-06 2020-06-23 Apple Inc. Intelligent automated assistant in a messaging environment
US10049668B2 (en) 2015-12-02 2018-08-14 Apple Inc. Applying neural network language models to weighted finite state transducers for automatic speech recognition
US10223066B2 (en) 2015-12-23 2019-03-05 Apple Inc. Proactive assistance based on dialog communication between devices
US10446143B2 (en) 2016-03-14 2019-10-15 Apple Inc. Identification of voice inputs providing credentials
US9934775B2 (en) 2016-05-26 2018-04-03 Apple Inc. Unit-selection text-to-speech synthesis based on predicted concatenation parameters
US9972304B2 (en) 2016-06-03 2018-05-15 Apple Inc. Privacy preserving distributed evaluation framework for embedded personalized systems
US10249300B2 (en) 2016-06-06 2019-04-02 Apple Inc. Intelligent list reading
US10049663B2 (en) 2016-06-08 2018-08-14 Apple, Inc. Intelligent automated assistant for media exploration
DK179309B1 (en) 2016-06-09 2018-04-23 Apple Inc Intelligent automated assistant in a home environment
US10192552B2 (en) 2016-06-10 2019-01-29 Apple Inc. Digital assistant providing whispered speech
US10586535B2 (en) 2016-06-10 2020-03-10 Apple Inc. Intelligent digital assistant in a multi-tasking environment
US10509862B2 (en) 2016-06-10 2019-12-17 Apple Inc. Dynamic phrase expansion of language input
US10067938B2 (en) 2016-06-10 2018-09-04 Apple Inc. Multilingual word prediction
US10490187B2 (en) 2016-06-10 2019-11-26 Apple Inc. Digital assistant providing automated status report
DK179343B1 (en) 2016-06-11 2018-05-14 Apple Inc Intelligent task discovery
DK179049B1 (en) 2016-06-11 2017-09-18 Apple Inc Data driven natural language event detection and classification
DK179415B1 (en) 2016-06-11 2018-06-14 Apple Inc Intelligent device arbitration and control
DK201670540A1 (en) 2016-06-11 2018-01-08 Apple Inc Application integration with a digital assistant
US10043516B2 (en) 2016-09-23 2018-08-07 Apple Inc. Intelligent automated assistant
US11281993B2 (en) 2016-12-05 2022-03-22 Apple Inc. Model and ensemble compression for metric learning
US10593346B2 (en) 2016-12-22 2020-03-17 Apple Inc. Rank-reduced token representation for automatic speech recognition
DK201770383A1 (en) 2017-05-09 2018-12-14 Apple Inc. User interface for correcting recognition errors
DK201770439A1 (en) 2017-05-11 2018-12-13 Apple Inc. Offline personal assistant
DK179745B1 (en) 2017-05-12 2019-05-01 Apple Inc. SYNCHRONIZATION AND TASK DELEGATION OF A DIGITAL ASSISTANT
DK201770428A1 (en) 2017-05-12 2019-02-18 Apple Inc. Low-latency intelligent automated assistant
DK179496B1 (en) 2017-05-12 2019-01-15 Apple Inc. USER-SPECIFIC Acoustic Models
DK201770431A1 (en) 2017-05-15 2018-12-20 Apple Inc. Optimizing dialogue policy decisions for digital assistants using implicit feedback
DK201770432A1 (en) 2017-05-15 2018-12-21 Apple Inc. Hierarchical belief states for digital assistants
DK179560B1 (en) 2017-05-16 2019-02-18 Apple Inc. Far-field extension for digital assistant services
US11005993B2 (en) 2017-07-14 2021-05-11 Google Llc Computational assistant extension device
WO2019017946A1 (en) * 2017-07-20 2019-01-24 Hewlett-Packard Development Company, L.P. Retaining apparatuses comprising connectors
CN107393535A (en) * 2017-08-29 2017-11-24 歌尔科技有限公司 A kind of method, apparatus, earphone and terminal for opening terminal speech identification function


Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61117601A (en) 1984-11-14 1986-06-05 Hitachi Elevator Eng & Serv Co Ltd Safety device of voice input type
JPS62252223A (en) 1986-04-25 1987-11-04 Nec Corp Movable sound input system
JPH04351094A (en) 1991-05-28 1992-12-04 Nec Corp Remote controller
JPH1011084A (en) 1996-06-21 1998-01-16 Fujitsu Ten Ltd Voice input device for on-vehicle navigation system
JP3829005B2 (en) 1998-02-26 2006-10-04 大日本印刷株式会社 Virtual environment presentation device
JPH11296073A (en) 1998-04-14 1999-10-29 Goto Kogaku Kenkyusho:Kk Operation system for planetarium theater
JP2000074466A (en) 1998-08-26 2000-03-14 Sanyo Electric Co Ltd Remote controller and air-conditioner
JP2002062894A (en) 2000-08-17 2002-02-28 Daiichikosho Co Ltd Wireless microphone having voice recognition remote control function
JP2002135868A (en) 2000-10-21 2002-05-10 Katsuya Tanaka System for controlling home electric appliance in voice
KR100368757B1 (en) 2001-02-13 2003-01-24 삼성전자 주식회사 Wireless headset apparatus for automatically establishing link and method thereof
JP4126888B2 (en) 2001-06-04 2008-07-30 セイコーエプソン株式会社 Short-range wireless interface built-in clock
JP2003323257A (en) 2002-05-01 2003-11-14 Canon Inc Wireless mouse with microphone
KR100512960B1 (en) 2002-09-26 2005-09-07 삼성전자주식회사 Flexible MEMS transducer and its manufacturing method, and flexible MEMS wireless microphone

Patent Citations (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4665544A (en) * 1984-09-05 1987-05-12 Mitsubishi Denki Kabushiki Kaisha Home control system and interphone system
US5086385A (en) * 1989-01-31 1992-02-04 Custom Command Systems Expandable home automation system
US5237305A (en) * 1990-11-30 1993-08-17 Mitsubishi Denki Kabushiki Kaisha Home bus system
US5657425A (en) * 1993-11-15 1997-08-12 International Business Machines Corporation Location dependent verbal command execution in a computer based control system
US5579221A (en) * 1993-12-31 1996-11-26 Samsung Electronics Co., Ltd. Home automation system having user controlled definition function
US5621662A (en) * 1994-02-15 1997-04-15 Intellinet, Inc. Home automation system
US5878394A (en) * 1994-04-21 1999-03-02 Info Byte Ag Process and device for the speech-controlled remote control of electrical consumers
EP0772184A1 (en) * 1995-11-06 1997-05-07 THOMSON multimedia S.A. Vocal identification of devices in a home environment
US5864481A (en) * 1996-01-22 1999-01-26 Raytheon Company Integrated, reconfigurable man-portable modular system
US6088595A (en) * 1997-11-01 2000-07-11 Lucent Technologies Inc. Arrangement for configuring multiple portable units for communicating with each other in a frequency hopping system
US6405261B1 (en) * 1997-11-26 2002-06-11 International Business Machines Corporation Method and apparatus for an automatic multi-rate wireless/wired computer network
US6762692B1 (en) * 1998-09-21 2004-07-13 Thomson Licensing S.A. System comprising a remote controlled apparatus and voice-operated remote control device for the apparatus
US6208972B1 (en) * 1998-12-23 2001-03-27 Richard Grant Method for integrating computer processes with an interface controlled by voice actuated grammars
US6513063B1 (en) * 1999-01-05 2003-01-28 Sri International Accessing network-based electronic information through scripted online interfaces using spoken input
US6738485B1 (en) * 1999-05-10 2004-05-18 Peter V. Boesen Apparatus, method and system for ultra short range communication
US6538623B1 (en) * 1999-05-13 2003-03-25 Pirooz Parnian Multi-media data collection tool kit having an electronic multi-media “case” file and method of use
US6266995B1 (en) * 1999-05-20 2001-07-31 Respiratory Management Services, Inc. Portable medical gas system tester
US20020008625A1 (en) * 2000-02-29 2002-01-24 Adams Jonathan D. Remote accountability system and method
US6167413A (en) * 2000-03-09 2000-12-26 Daley, Iii; Charles A. Wearable computer apparatus
US6654720B1 (en) * 2000-05-09 2003-11-25 International Business Machines Corporation Method and system for voice control enabling device in a service discovery network
US6443347B1 (en) * 2000-10-19 2002-09-03 International Business Machines Corporation Streamlined personal harness for supporting a wearable computer and associated equipment on the body of a user
US20020186180A1 (en) * 2000-11-30 2002-12-12 William Duda Hands free solar powered cap/visor integrated wireless multi-media apparatus
US20030060202A1 (en) * 2001-08-28 2003-03-27 Roberts Robin U. System and method for enabling a radio node to selectably function as a router in a wireless communications network
US20030174049A1 (en) * 2002-03-18 2003-09-18 Precision Dynamics Corporation Wearable identification appliance that communicates with a wireless communications network such as bluetooth
US20040004547A1 (en) * 2002-05-17 2004-01-08 Fireeye Development Incorporated System and method for identifying, monitoring and evaluating equipment, environmental and physiological conditions
US6995665B2 (en) * 2002-05-17 2006-02-07 Fireeye Development Incorporated System and method for identifying, monitoring and evaluating equipment, environmental and physiological conditions
US20030236821A1 (en) * 2002-06-05 2003-12-25 Goun-Zong Jiau Body wearable personal network server and system
US7034678B2 (en) * 2002-07-02 2006-04-25 Tri-Sentinel, Inc. First responder communications system
US20060125630A1 (en) * 2002-12-23 2006-06-15 Scott Technologies, Inc. Dual-mesh network and communication system for emergency services personnel
US7263379B1 (en) * 2002-12-23 2007-08-28 Sti Licensing Corp. Communications network for emergency services personnel
US20040203387A1 (en) * 2003-03-31 2004-10-14 Sbc Knowledge Ventures, L.P. System and method for controlling appliances with a wireless data enabled remote control
US20060136220A1 (en) * 2004-12-22 2006-06-22 Rama Gurram Controlling user interfaces with voice commands from multiple languages
US7418281B2 (en) * 2005-09-13 2008-08-26 International Business Machines Corporation Centralized voice recognition unit for wireless control of personal mobile electronic devices

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9646610B2 (en) 2012-10-30 2017-05-09 Motorola Solutions, Inc. Method and apparatus for activating a particular wireless communication device to accept speech and/or voice commands using identification data consisting of speech, voice, image recognition
US9144028B2 (en) 2012-12-31 2015-09-22 Motorola Solutions, Inc. Method and apparatus for uplink power control in a wireless communication system
US9854032B2 (en) 2016-02-05 2017-12-26 International Business Machines Corporation Context-aware task offloading among multiple devices
US10044798B2 (en) 2016-02-05 2018-08-07 International Business Machines Corporation Context-aware task offloading among multiple devices
US10484484B2 (en) 2016-02-05 2019-11-19 International Business Machines Corporation Context-aware task processing for multiple devices
US10484485B2 (en) 2016-02-05 2019-11-19 International Business Machines Corporation Context-aware task processing for multiple devices
WO2019013349A1 (en) * 2017-07-14 2019-01-17 ダイキン工業株式会社 Air conditioner, air-conditioning system, communication system, control system, machinery control system, machinery management system, and sound information analysis system
CN110678701A (en) * 2017-07-14 2020-01-10 大金工业株式会社 Air conditioner, air conditioning system, communication system, control system, equipment management system, and sound information analysis system
JPWO2019013349A1 (en) * 2017-07-14 2020-08-27 ダイキン工業株式会社 Air conditioners, air conditioning systems, communication systems, control systems, equipment control systems, equipment management systems, and sound information analysis systems
JP7060812B2 (en) 2017-07-14 2022-04-27 ダイキン工業株式会社 Air conditioner, air conditioning system, communication system, control system, equipment control system, equipment management system and sound information analysis system
CN114909780A (en) * 2017-07-14 2022-08-16 大金工业株式会社 Control system

Also Published As

Publication number Publication date
US7418281B2 (en) 2008-08-26
US20070060118A1 (en) 2007-03-15

Similar Documents

Publication Publication Date Title
US7418281B2 (en) Centralized voice recognition unit for wireless control of personal mobile electronic devices
US7035091B2 (en) Wearable computer system and modes of operating the system
EP2314077B1 (en) Wearable headset with self-contained vocal feedback and vocal command
US10367951B2 (en) Wireless ringer
US10817251B2 (en) Dynamic capability demonstration in wearable audio device
US11336990B2 (en) Decorative wireless communication system and module thereof
US10299025B2 (en) Wearable electronic system
US20080267433A1 (en) Bone-Conduction Loudspeaker Set, Electronic Equipment, Electronic Translation System, Auditory Support System, Navigation Apparatus, and Cellular Phone
WO2015006196A1 (en) Method and system for communicatively coupling a wearable computer with one or more non-wearable computers
KR20080026954A (en) Method for providing bluetooth auto pairing in mobile terminal
KR20150099156A (en) Wireless receiver and method for controlling the same
US10922044B2 (en) Wearable audio device capability demonstration
CN109429132A (en) Earphone system
KR20180033185A (en) Earset and its control method
JP2003520497A (en) Communication system and suitable control unit therefor
WO2006037814A2 (en) Personal communications device
US20090197533A1 (en) Bluetooth remote control wristwatch and related modification of bluetooth modules
KR101846218B1 (en) Language interpreter, speech synthesis server, speech recognition server, alarm device, lecture local server, and voice call support application for deaf auxiliaries based on the local area wireless communication network
US20230379615A1 (en) Portable audio device
KR100883102B1 (en) Method for providing condition of Headset and thereof
JP2015099974A (en) Speech receiving method and speech transmitting/receiving device
KR101861357B1 (en) Bluetooth device having function of sensing external noise
KR102270566B1 (en) Housing device
JP5871041B2 (en) Electronic device, control method and program for causing other electronic devices to execute set function
KR20170029255A (en) Smart watch

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTERNATIONAL BUSINESS MACHINES CORPORATION, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:GUYETTE, THOMAS M;REEL/FRAME:021107/0305

Effective date: 20050906

AS Assignment

Owner name: NUANCE COMMUNICATIONS, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTERNATIONAL BUSINESS MACHINES CORPORATION;REEL/FRAME:022330/0088

Effective date: 20081231

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION