US20080318595A1 - Position location system using multiple position location techniques - Google Patents

Position location system using multiple position location techniques

Info

Publication number
US20080318595A1
Authority
US
United States
Prior art keywords: position location, gaming, sub, location information, player
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/142,702
Inventor
Ahmadreza (Reza) Rofougaran
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Broadcom Corp
Priority to US12/142,702
Assigned to Broadcom Corporation (assignment of assignors' interest; assignor: Rofougaran, Ahmadreza Reza)
Publication of US20080318595A1
Assigned to Bank of America, N.A., as collateral agent (patent security agreement; assignor: Broadcom Corporation)
Assigned to Avago Technologies General IP (Singapore) Pte. Ltd. (assignment of assignors' interest; assignor: Broadcom Corporation)
Assigned to Broadcom Corporation (termination and release of security interest in patents; assignor: Bank of America, N.A., as collateral agent)
Legal status: Abandoned

Classifications

    • A: HUMAN NECESSITIES
      • A63: SPORTS; GAMES; AMUSEMENTS
        • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
          • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
            • A63F 13/20: Input arrangements for video game devices
              • A63F 13/21: characterised by their sensors, purposes or types
                • A63F 13/211: using inertial sensors, e.g. accelerometers or gyroscopes
                • A63F 13/212: using sensors worn by the player, e.g. for measuring heart beat or leg activity
                • A63F 13/213: comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
              • A63F 13/23: for interfacing with the game device, e.g. specific interfaces between game controller and console
                • A63F 13/235: using a wireless connection, e.g. infrared or piconet
            • A63F 13/55: Controlling game characters or game objects based on the game progress
              • A63F 13/57: Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
                • A63F 13/573: using trajectories of game objects, e.g. of a golf ball according to the point of impact
            • A63F 13/80: Special adaptations for executing a specific game genre or game mode
              • A63F 13/825: Fostering virtual characters
          • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
            • A63F 2300/10: characterized by input arrangements for converting player-generated signals into game device control signals
              • A63F 2300/1012: involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
              • A63F 2300/1025: details of the interface with the game device, e.g. USB version detection
                • A63F 2300/1031: using a wireless connection, e.g. Bluetooth, infrared connections
            • A63F 2300/50: characterized by details of game servers
              • A63F 2300/55: Details of game data or player data management
                • A63F 2300/5546: using player registration data, e.g. identification, account, preferences, game history
                  • A63F 2300/5553: user representation in the game field, e.g. avatar
    • G: PHYSICS
      • G01: MEASURING; TESTING
        • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
          • G01S 13/00: Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
            • G01S 13/003: Bistatic radar systems; Multistatic radar systems
            • G01S 13/02: Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
              • G01S 13/06: Systems determining position data of a target
                • G01S 13/42: Simultaneous measurement of distance and other co-ordinates
                  • G01S 13/426: Scanning radar, e.g. 3D radar
            • G01S 13/66: Radar-tracking systems; Analogous systems
              • G01S 13/72: for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
                • G01S 13/723: by using numerical data
            • G01S 13/87: Combinations of radar systems, e.g. primary radar and secondary radar
              • G01S 13/878: Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
          • G01S 7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
            • G01S 7/02: of systems according to group G01S13/00
              • G01S 7/41: using analysis of echo signal for target characterisation; Target signature; Target cross-section
                • G01S 7/411: Identification of targets based on measurements of radar reflectivity
                  • G01S 7/412: based on a comparison between measured values and known or stored values
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
                • G06F 3/012: Head tracking input arrangements
              • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
                • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
                  • G06F 3/0346: with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
                • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                  • G06F 3/045: using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact

Definitions

  • This invention relates generally to position location systems and more particularly to determining position of one or more objects within a position location system.
  • Radio frequency (RF) wireless communication systems may operate in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof.
  • A wireless communication device, such as a cellular telephone, two-way radio, personal digital assistant (PDA), personal computer (PC), laptop computer, home entertainment equipment, RFID reader, or RFID tag, communicates directly or indirectly with other wireless communication devices.
  • For direct communications (also known as point-to-point communications), the participating wireless communication devices tune their receivers and transmitters to the same channel or channels (e.g., one of the plurality of radio frequency (RF) carriers of the wireless communication system) and communicate over that channel(s).
  • For indirect wireless communications, each wireless communication device communicates directly with an associated base station (e.g., for cellular services) and/or an associated access point (e.g., for an in-home or in-building wireless network) via an assigned channel.
  • the associated base stations and/or associated access points communicate with each other directly, via a system controller, via the public switched telephone network, via the Internet, and/or via some other wide area network.
  • For each RF wireless communication device to participate in wireless communications, it includes a built-in radio transceiver (i.e., receiver and transmitter) or is coupled to an associated radio transceiver (e.g., a station for in-home and/or in-building wireless communication networks, an RF modem, etc.).
  • the receiver is coupled to the antenna and includes a low noise amplifier, one or more intermediate frequency stages, a filtering stage, and a data recovery stage.
  • the low noise amplifier receives inbound RF signals via the antenna and amplifies them.
  • the one or more intermediate frequency stages mix the amplified RF signals with one or more local oscillations to convert the amplified RF signal into baseband signals or intermediate frequency (IF) signals.
  • the filtering stage filters the baseband signals or the IF signals to attenuate unwanted out of band signals to produce filtered signals.
  • the data recovery stage recovers raw data from the filtered signals in accordance with the particular wireless communication standard.
  • the transmitter includes a data modulation stage, one or more intermediate frequency stages, and a power amplifier.
  • the data modulation stage converts raw data into baseband signals in accordance with a particular wireless communication standard.
  • the one or more intermediate frequency stages mix the baseband signals with one or more local oscillations to produce RF signals.
  • the power amplifier amplifies the RF signals prior to transmission via an antenna.
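  • As an illustrative aside (not part of the patent text), the mixing step described above can be sketched numerically. The short Python snippet below uses purely hypothetical sample rates and frequencies to mix a baseband tone with a local oscillation, producing an RF signal, the counterpart of the receiver's down-conversion:

        import numpy as np

        # Illustrative values only; nothing here is specified by the patent.
        fs = 1.0e6          # sample rate (Hz)
        f_baseband = 10e3   # baseband tone from the data modulation stage (Hz)
        f_lo = 250e3        # local oscillation (Hz)
        t = np.arange(0, 1e-3, 1.0 / fs)

        baseband = np.cos(2 * np.pi * f_baseband * t)   # output of the data modulation stage
        lo = np.cos(2 * np.pi * f_lo * t)               # local oscillation
        rf = baseband * lo                              # mixer output: energy at f_lo +/- f_baseband

        # A power amplifier would scale this signal before it reaches the antenna.
        spectrum = np.abs(np.fft.rfft(rf))
        peak_hz = np.fft.rfftfreq(len(rf), 1.0 / fs)[np.argmax(spectrum)]
        print(f"strongest component near {peak_hz:.0f} Hz")  # ~f_lo + f_baseband or f_lo - f_baseband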
  • radio transceivers are implemented in one or more integrated circuits (ICs), which are inter-coupled via traces on a printed circuit board (PCB).
  • the radio transceivers operate within licensed or unlicensed frequency spectrums, for example, wireless local area network (WLAN) transceivers operating within the unlicensed Industrial, Scientific, and Medical (ISM) frequency bands.
  • In IR communication systems, an IR device includes a transmitter, a light emitting diode (LED), a receiver, and a silicon photo diode.
  • the transmitter modulates a signal, which drives the LED to emit infrared radiation that is focused by a lens into a narrow beam.
  • the receiver, via the silicon photo diode, receives the narrow-beam infrared radiation and converts it into an electric signal.
  • IR communications are used in video games to detect the direction in which a game controller is pointed.
  • an IR sensor is placed near the game display, where the IR sensor detects the IR signal transmitted by the game controller. If the game controller is too far away, too close, or angled away from the IR sensor, the IR communication will fail.
  • Further advances in video gaming include three accelerometers in the game controller to detect motion by way of acceleration.
  • the motion data is transmitted to the game console via a Bluetooth wireless link.
  • the Bluetooth wireless link may also transmit the IR direction data to the game console and/or convey other data between the game controller and the game console.
  • the IR communication has a limited area in which a player can be for the IR communication to work properly.
  • the accelerometer only measures acceleration such that true one-to-one detection of motion is not achieved.
  • the gaming motion is limited to a handful of directions (e.g., horizontal, vertical, and a few diagonal directions).
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a gaming system in accordance with the present invention.
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system in accordance with the present invention.
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 4 is a schematic block diagram of a side view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 5 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 6 is a schematic block diagram of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 7 is a schematic block diagram of another embodiment of a gaming system in accordance with the present invention.
  • FIGS. 8-10 are diagrams of an embodiment of a coordinate system of a gaming system in accordance with the present invention.
  • FIGS. 11-13 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with the present invention.
  • FIG. 14 is a diagram of a method for determining position and/or motion tracking in accordance with the present invention.
  • FIGS. 15A and 15B are diagrams of other methods for determining position and/or motion tracking in accordance with the present invention.
  • FIGS. 16-18 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with the present invention.
  • FIGS. 19-21 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with the present invention.
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 23 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 24 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 25 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 26 is a diagram of another embodiment of a coordinate system of a gaming system in accordance with the present invention.
  • FIG. 27 is a schematic block diagram of an embodiment of a wireless communication system in accordance with the present invention.
  • FIG. 28 is a schematic block diagram of another embodiment of a wireless communication system in accordance with the present invention.
  • FIG. 29 is a schematic block diagram of another embodiment of a wireless communication system in accordance with the present invention.
  • FIG. 30 is a schematic block diagram of an overhead view of an embodiment of determining position and/or motion tracking in accordance with the present invention.
  • FIG. 31 is a schematic block diagram of a side view of an embodiment of determining position and/or motion tracking in accordance with the present invention.
  • FIG. 32 is a schematic block diagram of an embodiment of a transceiver in accordance with the present invention.
  • FIG. 33 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 34 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 35 is a schematic block diagram of an embodiment of a wireless communication in accordance with the present invention.
  • FIG. 36 is a diagram of an embodiment of an antenna pattern in accordance with the present invention.
  • FIG. 37 is a diagram of another embodiment of an antenna pattern in accordance with the present invention.
  • FIG. 38 is a diagram of an example of receiving an RF signal in accordance with the present invention.
  • FIG. 39 is a diagram of an example of frequency dependent in-air attenuation in accordance with the present invention.
  • FIGS. 40 and 41 are diagrams of an example of frequency dependent distance calculation in accordance with the present invention.
  • FIG. 42 is a diagram of an example of constructive and destructive signaling in accordance with the present invention.
  • FIG. 43 is a diagram of another example of constructive and destructive signaling in accordance with the present invention.
  • FIG. 44 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 45 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 46 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 47 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 48 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 49 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention.
  • FIG. 50 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 51 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 52 is a schematic block diagram of a side view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 53 is a schematic block diagram of an embodiment of an RFID reader and an RFID tag in accordance with the present invention.
  • FIG. 54 is a diagram of a method for determining position in accordance with the present invention.
  • FIG. 55 is a schematic block diagram of an embodiment of a gaming object in accordance with the present invention.
  • FIG. 56 is a schematic block diagram of an embodiment of three-dimensional antenna structure in accordance with the present invention.
  • FIG. 57 is a diagram of an example of an antenna radiation pattern in accordance with the present invention.
  • FIGS. 58 and 59 are diagrams of an example of frequency dependent motion calculation in accordance with the present invention.
  • FIG. 60 is a diagram of a method for determining motion in accordance with the present invention.
  • FIG. 61 is a diagram of an example of determining a motion vector in accordance with the present invention.
  • FIG. 62 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 63 is a diagram of an example of audio and near audio frequency bands in accordance with the present invention.
  • FIG. 64 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 65 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention.
  • FIG. 66 is a schematic block diagram of an overhead view of yet another embodiment of a position location system in accordance with the present invention.
  • FIG. 67 is a flow chart illustrating operations of a position location system employing multiple position location techniques.
  • FIG. 68 is a flow chart illustrating usage of multiple position location techniques for locating an object.
  • FIG. 69 is a flow chart illustrating usage of multiple position location techniques for determining position and motion of an object.
  • FIG. 70 is a flow chart illustrating operation for using multiple position location techniques to determine position and orientation of an object.
  • FIG. 71 is a flow chart illustrating operation for using multiple position location techniques to determine positions of multiple objects.
  • FIG. 72 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects.
  • FIG. 73 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects.
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a video gaming system 10 that includes a game console device 12 and a gaming object 14 associated with a player 16 .
  • the video gaming system 10 is within a gaming environment 22 , which may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 can be proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the game console device 12 determines the gaming environment 22 . This may be done by sweeping the area with one or more signals within one or more frequency bands.
  • the one or more signals may be in the ultrasound frequency band of 20 kHz to 200 MHz, the radio frequency band of 30 Hz to 3 GHz, the microwave frequency band of 3 GHz to 300 GHz, the infrared (IR) frequency band of 300 GHz to 428 THz, the visible light frequency band of 428 THz to 750 THz (1 THz = 10^12 Hz), the ultraviolet radiation frequency band of 750 THz to 30 PHz (1 PHz = 10^15 Hz), and/or the X-ray frequency band of 30 PHz to 30 EHz (1 EHz = 10^18 Hz).
  • the determination of the gaming environment 22 continues with the game console device 12 measuring at least one of: reflection of the one or more signals, absorption of the one or more signals, refraction of the one or more signals, pass through of the one or more signals, angle of incidence of the one or more signals, backscattering of the one or more signals, and magnetization induced by the one or more signals to produce measured signal effects.
  • the game console device 12 identifies different objects based on the measured signal effects (e.g., inanimate objects have different reflective, absorption, pass through, and/or refractive properties of the one or more signals than animate beings).
  • the game console device 12 determines distance of the different objects with respect to itself. From this data, the game console device 12 generates a three-dimensional topographic map of the area in which the video gaming system 10 resides to produce the gaming environment 22 .
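  • As a hedged illustration of how measured signal effects might drive object identification, the sketch below uses hypothetical thresholds and field names (none of them come from the patent) to label each detected return and place it at its measured distance:

        from dataclasses import dataclass

        @dataclass
        class SignalEffects:
            """Measured effects of the sweep signals on one detected return.
            Values are normalized 0..1; the thresholds below are hypothetical."""
            reflection: float
            absorption: float
            distance_m: float   # range estimated from round-trip delay

        def classify(effects: SignalEffects) -> str:
            # Hypothetical rule: animate beings tend to absorb more and reflect
            # less of the sweep signals than hard inanimate surfaces.
            if effects.absorption > 0.6 and effects.reflection < 0.4:
                return "animate"
            return "inanimate"

        # Build a rough topographic entry (object type, distance) for each return.
        returns = [SignalEffects(0.8, 0.2, 3.1), SignalEffects(0.3, 0.7, 1.4)]
        print([(classify(r), r.distance_m) for r in returns])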
  • the gaming environment 22 includes the player 16 , the gaming object 14 , a couch, a chair, a desk, the four encircling walls, the floor, and the ceiling.
  • the game console device 12 maps the gaming environment 22 to a coordinate system (e.g., a three-dimensional Cartesian coordinate system [x, y, z], a spherical coordinate system [r, θ, φ], etc.). The game console device 12 then determines the position 18 of the player 16 and/or the gaming object 14 within the gaming environment in accordance with the coordinate system.
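  • A minimal sketch of mapping between the two coordinate systems mentioned above (assuming the usual physics convention for spherical coordinates; the function names are illustrative, not from the patent):

        import math

        def spherical_to_cartesian(r, theta, phi):
            """Convert a fix given as (range r, polar angle theta, azimuth phi),
            angles in radians, to Cartesian coordinates [x, y, z]."""
            x = r * math.sin(theta) * math.cos(phi)
            y = r * math.sin(theta) * math.sin(phi)
            z = r * math.cos(theta)
            return (x, y, z)

        def cartesian_to_spherical(x, y, z):
            r = math.sqrt(x * x + y * y + z * z)
            theta = math.acos(z / r) if r else 0.0
            phi = math.atan2(y, x)
            return (r, theta, phi)

        # Example: a player detected 2.5 m away, 60 degrees from vertical, 30 degrees azimuth.
        print(spherical_to_cartesian(2.5, math.radians(60), math.radians(30)))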
  • the game console device 12 tracks the motion 20 of the player 16 and/or the gaming object 14 .
  • the game console device 12 may determine the position 18 of the gaming object 14 and/or the player 16 within a positioning tolerance (e.g., within a meter) at a positioning update rate (e.g., once every second or once every few seconds) and track the motion 20 within a motion tracking tolerance (e.g., within a few millimeters) at a motion tracking update rate (e.g., once every 10-100 milliseconds).
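  • One way to read the two update rates above is as a two-rate loop: a coarse absolute fix roughly once per second and fine relative motion every 50 ms or so. The sketch below is a hypothetical illustration with placeholder measurement functions and illustrative timing, not the patent's implementation:

        import time

        def coarse_position_update():
            # Placeholder for the meter-accuracy positioning technique.
            return (1.0, 2.0, 0.0)

        def fine_motion_update():
            # Placeholder for the millimeter-accuracy motion-tracking technique.
            return (0.001, -0.002, 0.000)

        def tracking_loop(duration_s=2.0, position_period_s=1.0, motion_period_s=0.05):
            start = last_fix = time.monotonic()
            position = coarse_position_update()
            while time.monotonic() - start < duration_s:
                now = time.monotonic()
                if now - last_fix >= position_period_s:
                    position = coarse_position_update()   # re-anchor within the positioning tolerance
                    last_fix = now
                delta = fine_motion_update()              # displacement since the last motion update
                position = tuple(p + d for p, d in zip(position, delta))
                time.sleep(motion_period_s)
            return position

        print(tracking_loop())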
  • the game console device 12 receives a gaming object response regarding a video game function from the gaming object 14 .
  • the gaming object 14 may be a wireless game controller and/or any object used or worn by the player to facilitate play of a video game.
  • the gaming object 14 may be a simulated sword, a simulated gun, a helmet, a vest, a hat, shoes, socks, pants, shorts, gloves, etc.
  • the game console device 12 integrates the gaming object response and the motion 20 of the player and/or the gaming object 14 with the video game function. For example, if the video game function corresponds to a video tennis lesson (e.g., a ball machine feeding balls), the game console device 12 tracks the motion of the player 16 and the associated gaming object 14 (e.g., a simulated tennis racket) and maps the motion 20 with the feeding balls to emulate a real tennis lesson.
  • the motion 20, which includes direction and velocity, enables the game console device 12 to determine how the tennis ball is being struck. Based on how it is being struck, the game console device 12 determines the ball's path and provides a video representation thereof.
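  • To make the tennis example concrete, one could estimate the racket's velocity from two motion-tracking samples and launch the ball along it. The following is a rough sketch with made-up numbers and a deliberately crude physics model, not the patent's algorithm:

        import numpy as np

        def racket_velocity(p_prev, p_curr, dt):
            """Finite-difference velocity (m/s) of the tracked gaming object."""
            return (np.asarray(p_curr, float) - np.asarray(p_prev, float)) / dt

        def ball_path(strike_point, racket_vel, steps=5, dt=0.1, restitution=0.8):
            """Very rough trajectory: the ball leaves the strike point with a velocity
            proportional to the racket velocity, then falls under gravity."""
            g = np.array([0.0, 0.0, -9.81])
            pos = np.asarray(strike_point, dtype=float)
            vel = restitution * np.asarray(racket_vel, dtype=float)
            path = [pos.copy()]
            for _ in range(steps):
                vel = vel + g * dt
                pos = pos + vel * dt
                path.append(pos.copy())
            return path

        # Two motion-tracking samples taken 50 ms apart (illustrative values).
        v = racket_velocity([0.00, 0.00, 1.00], [0.05, 0.30, 1.02], 0.05)
        for p in ball_path([0.05, 0.30, 1.02], v):
            print(np.round(p, 2))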
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system 10 of FIG. 1 to illustrate that the position 18 and motion tracking 20 are done in three-dimensional space. Since the game console device 12 does three-dimensional positioning 18 and motion tracking 20 , the distance and/or angle of the gaming object 14 and/or player 16 to the game console device 12 is a negligible factor. As such, the gaming system 10 provides accurate motion tracking of the gaming object 14 and/or player 16 , which may be used to map the player's movements to a graphics image for true interactive video game play.
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes the game console device 12 , the gaming objects 14 - 15 , and one or more peripheral sensors 36 - 40 .
  • the game console device 12 includes a video display interface 34 (e.g., a video display driver, a video graphics accelerator, a video graphics engine, a video graphics array (VGA) card, etc.), a transceiver 32 (which may include a peripheral sensor), and a processing module 30 .
  • the processing module 30 may be a single processing device or a plurality of processing devices.
  • Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
  • the processing module 30 may have an associated memory and/or memory element (not shown), which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • when the processing module 30 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64 .
  • In operation, the transceiver 32 generates the one or more signals within one or more frequency bands for sweeping the area to facilitate the determination of the gaming environment. In addition, the transceiver 32 generates signals during video game play to facilitate the determination of the gaming objects' and/or the player's position 18 and generates signals to facilitate the determination of the gaming object's and/or the player's motion 20.
  • To determine position 18 and track motion 20, the transceiver 32 may utilize a first technique, which provides a first tolerance (e.g., accuracy within a meter, as may be obtained by a 2.4 GHz or 5 GHz localized positioning system as will be discussed with reference to later figures), and a second technique, which provides a second tolerance (e.g., accuracy within a few millimeters, as may be obtained by a 60 GHz localized positioning system as will be discussed with reference to FIGS. 6, 7, and 27-29, or a 60 GHz millimeter wave (MMW) radar system as will be discussed with reference to FIGS. 30-34).
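  • A simple way to picture combining the two tolerances is to anchor the millimeter-level relative track to the most recent meter-level absolute fix. The sketch below is only illustrative; a real system would presumably weight the two estimates by their tolerances:

        import numpy as np

        def fuse(coarse_fix, fine_offsets):
            """Anchor fine (millimeter-accuracy) relative displacements to one
            coarse (meter-accuracy) absolute fix taken at the start."""
            position = np.asarray(coarse_fix, dtype=float)
            track = [position.copy()]
            for d in fine_offsets:
                position = position + np.asarray(d, dtype=float)
                track.append(position.copy())
            return track

        coarse = [2.0, 1.0, 0.9]                                  # e.g. from the 2.4/5 GHz technique
        fine = [[0.003, 0.0, 0.001], [0.002, -0.001, 0.0]]        # e.g. from the 60 GHz technique
        print(fuse(coarse, fine)[-1])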
  • the transceiver 32 receives responses (e.g., reflection of the one or more signals, absorption of the one or more signals, refraction of the one or more signals, pass through of the one or more signals, angle of incidence of the one or more signals, backscattering of the one or more signals, a response to the one or more signals, and magnetization induced by the one or more signals to produce measured signal effects), converts the responses to one or more digital signals, and provides the one or more digital signals to the processing module 30.
  • the transceiver 32 may be an ultrasound transceiver that transmits one or more ultrasound signals within an ultrasound frequency band.
  • the ultrasound transceiver receives at least one inbound ultrasound signal (e.g., reflection, refraction, echo, etc.) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 may be a radio frequency (RF) transceiver that transmits one or more signals within a radio frequency band.
  • the RF transceiver receives at least one inbound RF signal (e.g., reflection, refraction, response, backscatter, etc.) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 is a microwave transceiver that transmits the one or more signals within a microwave frequency band.
  • the microwave transceiver receives at least one inbound microwave signal (e.g., reflection, refraction, response, backscatter, etc.) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 is an infrared transceiver that transmits the one or more signals within an infrared frequency band.
  • the infrared transceiver receives at least one inbound infrared signal (e.g., reflection, refraction, angle of incidence, response, backscatter, etc.) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 is a laser transceiver that transmits the one or more signals within a visible light frequency band.
  • the laser transceiver, which may use fiber optics, receives at least one inbound visible light signal (e.g., reflection, refraction, response, backscatter, etc.) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 is a digital camera that utilizes ambient light as the one or more signals within the visible light frequency band.
  • the digital camera receives at least one inbound visible light signal (e.g., reflection and/or refraction of light off the gaming environment, the player, and the gaming object) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 is an ultraviolet transceiver that transmits the one or more signals within an ultraviolet radiation frequency band.
  • the ultraviolet transceiver receives at least one inbound ultraviolet radiation signal (e.g., reflection, absorption, and/or refraction of UV light off the gaming environment, the player, and the gaming object) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 is an X-ray transceiver that transmits the one or more signals within an X-ray frequency band.
  • the X-ray transceiver receives at least one inbound X-ray signal (e.g., reflection, absorption, and/or refraction of X-rays off the player and/or the gaming object) that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the transceiver 32 is a magnetic source that transmits the one or more signals as one or more magnetic signals.
  • the magnetic source receives at least one inbound magnetic field that facilitates the measuring of at least one of: the reflection of the one or more signals, the absorption of the one or more signals, the refraction of the one or more signals, the pass through of the one or more signals, the angle of incidence of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • the magnetic source may include three coils to generate magnetic gradients in the x, y, and z directions of the magnetic source. The coils may be powered by amplifiers that enable rapid and precise adjustments of each coil's field strength and direction.
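  • As a loose illustration of how gradient coils can support positioning, assuming (purely for this sketch) that each coil's field component varies approximately linearly with position along its axis, the estimate along each axis is the measured field offset divided by the commanded gradient strength:

        def position_from_gradients(measured_offsets, gradients):
            # Toy linear model: B_i is roughly G_i * x_i, so x_i is roughly B_i / G_i.
            # Values and units are illustrative, not taken from the patent.
            return tuple(b / g for b, g in zip(measured_offsets, gradients))

        # Measured field offsets (arbitrary units) and gradient strengths (units per meter).
        print(position_from_gradients((0.12, -0.05, 0.30), (0.10, 0.10, 0.10)))
        # -> (1.2, -0.5, 3.0): meters along the x, y, and z axes of the magnetic source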
  • the transceiver 32 may include one or more of the ultrasound transceiver, the RF transceiver, the microwave transceiver, the infrared transceiver, the laser transceiver, the digital camera, the ultraviolet transceiver, the X-ray transceiver, and the magnetic source transceiver.
  • the processing module 30 receives the one or more digital signals from the transceiver 32 and processes them to determine the gaming environment 22 , the position 18 of the player 16 and/or the gaming objects 14 - 15 , and the motion 20 of the player 16 and/or the gaming object 14 .
  • Such processing includes one or more of determining the reflection of the one or more signals, determining the absorption of the one or more signals, determining the refraction of the one or more signals, determining the pass through of the one or more signals, determining the angle of incidence of the one or more signals, interpreting the backscattering of the one or more signals, interpreting a signal response, and determining the magnetization induced by the one or more signals.
  • the process further includes identifying objects, players, and gaming objects based on the preceding determinations and/or interpretations.
  • the one or more peripheral sensors 36-40, which may be an ultrasound transceiver, an RF transceiver, a microwave transceiver, an infrared transceiver, a laser transceiver, a digital camera, an ultraviolet transceiver, an X-ray transceiver, a magnetic source transceiver, an access point, a local positioning system transmitter, a local positioning system receiver, etc., transmit one or more signals and receive responses thereto that facilitate the determination of the player's and/or gaming object's position 18 and/or motion 20.
  • the peripheral sensors 36-40 may be enabled at the same time using different frequencies, different time slots, time-space encoding, or frequency-space encoding, or may be enabled at different times in a round-robin, polling, or token-passing manner.
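  • The round-robin alternative can be sketched as enabling one peripheral sensor at a time and collecting its response. The sensor objects below are stand-ins (the patent also allows simultaneous operation on different frequencies or time slots):

        from itertools import cycle

        def round_robin_poll(sensors, num_rounds=2):
            """Enable one peripheral sensor per step and record its reading."""
            readings = []
            order = cycle(sensors)
            for _ in range(num_rounds * len(sensors)):
                sensor = next(order)
                readings.append((sensor["name"], sensor["read"]()))
            return readings

        sensors = [
            {"name": "sensor_36", "read": lambda: 0.41},
            {"name": "sensor_38", "read": lambda: 0.37},
            {"name": "sensor_40", "read": lambda: 0.44},
        ]
        print(round_robin_poll(sensors))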
  • the game console device 12 receives a first gaming object response regarding the video game function from the first associated gaming object 14 and a second gaming object response regarding the video game function from the second associated gaming object 15 .
  • the game console device 12 integrates the first gaming object response, the second gaming object response, the motion of the first player, the motion of the second player, the motion of the first associated gaming object, and the motion of the second associated gaming object with the video game function.
  • the position and motion tracking apparatus may be used for home security, baby monitoring, store security, shop-lifting detection, concealed weapon detection, etc.
  • Such an apparatus includes a transceiver section and a processing module.
  • the transceiver section transmits one or more signals within one or more frequency bands in a given area.
  • the one or more signals may be in the ultrasound frequency band of 20 kHz to 200 MHz, the radio frequency band of 30 Hz to 3 GHz, the microwave frequency band of 3 GHz to 300 GHz, the infrared (IR) frequency band of 300 GHz to 428 THz, the visible light frequency band of 428 THz to 750 THz (1 THz = 10^12 Hz), the ultraviolet radiation frequency band of 750 THz to 30 PHz (1 PHz = 10^15 Hz), and/or the X-ray frequency band of 30 PHz to 30 EHz (1 EHz = 10^18 Hz).
  • the processing module processes the digital response signal to determine a measure of at least one of: reflection of the one or more signals, absorption of the one or more signals, refraction of the one or more signals, pass through of the one or more signals, angle of incidence of the one or more signals, backscattering of the one or more signals, and magnetization induced by the one or more signals to produce measured signal effects.
  • the apparatus then identifies different objects based on the measured signal effects (e.g., inanimate objects have different reflective, absorption, pass through, and/or refractive properties of the one or more signals than animate beings).
  • the processing module determines distance of the different objects with respect to itself. From this data, the apparatus generates a three-dimensional topographic map of the area to produce a digital representation of the environment. The apparatus then maps the environment to a coordinate system (e.g., a three-dimensional Cartesian coordinate system [x, y, z], a spherical coordinate system [r, θ, φ], etc.) and determines the position of an object or person within the environment in accordance with the coordinate system.
  • the processing module tracks the motion of the object or person.
  • the transceiver section receives responses that provide millimeter accuracy for the object and/or person (e.g., 60 GHz signals, light, etc.) and converts the responses to digital signals.
  • the processing module processes the digital signals with respect to the environment and the object or person to track motion.
  • FIG. 4 is a schematic block diagram of a side view of another embodiment of a gaming system 10 that includes one or more gaming objects 14 - 15 , the player 16 , the game console device 12 , and one or more sensing tags 44 proximal to the player 16 and/or to the gaming object 14 - 15 .
  • the one or more sensing tags 44 may be a metal patch, an RFID tag, a light reflective material, a light absorbent material, a specific RGB [red, green, blue] color, a 60 GHz transceiver, and/or any other component, material, and/or texture that assists the game console device 12 in determining the position and/or motion of the player 16 and/or the gaming object 14 - 15 .
  • the metal patch will reflect RF and/or microwave signals at various angles depending on the position of the metal patch with respect to the game console device 12 .
  • the game console device 12 utilizes the various angles to determine the position and/or motion of the player 16 and/or the gaming object 14 - 15 .
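  • If two sensors at known positions each observe the angle of a reflection off the metal patch, the patch can be located where the two bearing lines cross. The 2-D sketch below uses made-up positions and angles and assumes the two bearings are not parallel; it is only one way to use the reflection angles, not necessarily the patent's method:

        import numpy as np

        def intersect_bearings(p1, theta1, p2, theta2):
            """Intersect two 2-D bearing lines: the sensor at p_i sees the reflecting
            patch at angle theta_i (radians from the x-axis)."""
            d1 = np.array([np.cos(theta1), np.sin(theta1)])
            d2 = np.array([np.cos(theta2), np.sin(theta2)])
            A = np.column_stack((d1, -d2))            # solve p1 + t1*d1 = p2 + t2*d2
            t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
            return np.asarray(p1, float) + t[0] * d1

        print(intersect_bearings((0.0, 0.0), np.radians(45), (3.0, 0.0), np.radians(135)))
        # -> the patch is located near (1.5, 1.5)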
  • the gaming objects 14 - 15 may include a game controller that is held by the player and may further include a helmet, a shirt, pants, gloves, and/or socks, which are worn by the player.
  • Each of the gaming objects 14 - 15 includes one or more sensing tags 44 , which facilitate the determining of the position 18 and/or motion 20 .
  • An example of a gaming system 10 using RFID tags will be discussed with reference to FIGS. 51-54 .
  • FIG. 5 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12 , a plurality of players 16 , 50 and a plurality of gaming objects 14 , 52 .
  • the game console device 12 determines the position 18 of the first player 16 and/or the associated gaming object 14 within the gaming environment 22 in accordance with the coordinate system.
  • the game console device 12 also determines the position 54 of the second player 50 and/or the associated gaming object 52 within the gaming environment in accordance with the coordinate system.
  • the game console device 12 separately tracks the motion 20 of the first player 16 , the motion 20 of the first associated gaming object 14 , the motion 56 of the second player 50 , and the motion 56 of the second associated gaming object 52 . While tracking the motion of the players and/or gaming objects, the game console may receive a gaming object response regarding the video game function from the first and/or the second associated gaming object 14 , 52 .
  • the game console device 12 integrates the first and/or second gaming object response, the motion of the first player, the motion of the second player, the motion of the first associated gaming object, and the motion of the second associated gaming object with the video game function. While the present example shows two players and associated gaming objects, more than two players and associated gaming objects could be in the gaming environment. In this instance, the game console device 12 separately determines the position and the motion of the players and the associated gaming objects as previously discussed and integrates their play in the video gaming graphics being displayed.
  • FIG. 6 is a schematic block diagram of another embodiment of a gaming system 10 that includes a game console device 12 , a plurality of localized position system (LPS) transmitters 60 - 64 , at least one gaming object 14 , and an LPS receiver 66 associated with the gaming object 14 .
  • the LPS receiver 66 and the gaming object 14 may be separate devices or an integrated device.
  • the LPS receiver 66 may be a packaged printed circuit board (PCB) that includes an integrated circuit (IC) LPS receiver and the gaming object 14 is a game controller, where the packaged PCB is attachable to the game controller.
  • the gaming object 14 and the LPS receiver 66 may be integrated in a device, such as a cell phone, a game controller, a personal digital assistant, a handheld computing unit, etc.
  • Each LPS transmitter 60 - 64 includes an accurate clock (e.g., an atomic clock) or is coupled to an accurate clock source (e.g., has a global positioning system (GPS) receiver) to provide an accurate time standard available for synchronization at any point in the physical area.
  • Each LPS transmitter 60 - 64 transmits a spread spectrum signal containing a BPSK (bi-phase shift keyed) signal in which 1's and 0's are represented by reversal of the phase of the carrier.
  • This message is transmitted at a specific frequency at a “chipping rate” of x bits per second (e.g., 50 bits per millisecond).
  • the message may repeat every 30 milliseconds (or more frequently) and may be referred to as a local C/A signal (Coarse Acquisition signal).
  • This message contains information regarding the entire LPS and information regarding the LPS transmitter sending the local C/A signal.
  • the LPS receiver 66 identifies each LPS transmitter's 60 - 64 signals by their distinct C/A code pattern, and then measures the time delay for each LPS transmitter. To do this, the receiver 66 produces an identical C/A sequence using the same seed number as the LPS transmitter. By lining up the two sequences, the receiver can measure the delay and calculate the distance to the LPS transmitter 60 - 64 .
  • the LPS receiver 66 calculates the position of the corresponding plurality of LPS transmitters based on the local C/A signals. For example, the LPS receiver 66 uses the position data of the local C/A signals to calculate the LPS transmitter's position. The LPS receiver then determines its location based on the distance of the corresponding plurality of LPS transmitters and the position of the corresponding plurality of LPS transmitters 60 - 64 . For instance, by knowing the position and the distance of an LPS transmitter, the LPS receiver can determine its location to be somewhere on the surface of an imaginary sphere centered on that LPS transmitter and whose radius is the distance to it. When four LPS transmitters 60 - 64 are measured simultaneously, the intersection of the four imaginary spheres reveals the location of the receiver. Often, these spheres will overlap slightly instead of meeting at one point, so the receiver will yield a mathematically most-probable position (and often indicate the uncertainty).
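  • A minimal Python sketch of this sphere-intersection calculation, assuming four LPS transmitters at known coordinates and using an iterative least-squares fit to obtain the most-probable receiver position (all coordinates, distances, and function names below are illustrative assumptions):

      import numpy as np

      def lps_position(tx_positions, distances, iterations=20):
          # Estimate the receiver position from known LPS transmitter positions and
          # measured distances by least-squares intersection of the range spheres.
          p = np.mean(tx_positions, axis=0)            # initial guess: centroid of transmitters
          for _ in range(iterations):
              diffs = p - tx_positions                 # vectors from each transmitter to the guess
              ranges = np.linalg.norm(diffs, axis=1)   # predicted distances
              residuals = ranges - distances           # mismatch with the measured distances
              jacobian = diffs / ranges[:, None]       # sensitivity of each range to the position
              step, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
              p = p - step                             # move toward the most-probable position
          return p

      # four LPS transmitters at known coordinates (meters) and measured distances d1-d4
      tx = np.array([[0.0, 0.0, 2.5], [4.0, 0.0, 2.5], [0.0, 3.0, 2.5], [4.0, 3.0, 0.5]])
      d = np.array([2.9, 3.4, 3.1, 3.8])
      print(lps_position(tx, d))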
  • the LPS receiver 66 , via the gaming object 14 , transmits its position within the coordinate system to the game console device 12 .
  • the LPS receiver 66 , via the gaming object 14 , may provide the LPS transmitter distances (e.g., d 1 , d 2 , and d 3 ) to the game console device 12 such that the game console device 12 can determine the position of the gaming object within the gaming environment.
  • the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may also be used to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
  • FIG. 7 is a schematic block diagram of another embodiment of a gaming system 10 that includes a game console device 12 , at least one gaming object 14 , a player 16 , a local positioning system (LPS) transmitter 74 , and a plurality of LPS receivers 68 - 72 .
  • the LPS transmitter 74 and the gaming object 14 may be separate devices or an integrated device.
  • the LPS transmitter 74 may be a packaged printed circuit board (PCB) that includes an integrated circuit (IC) LPS transmitter and the gaming object 14 is a game controller, where the packaged PCB is attachable to the game controller.
  • the gaming object 14 and the LPS transmitter 74 may be integrated in a device, such as a cell phone, a game controller, a personal digital assistant, a handheld computing unit, etc.
  • the LPS transmitter 74 includes an accurate clock and transmits a narrow pulse (e.g., pulse width less than 1 nanosecond) at a desired rate (e.g., once every millisecond to once every few seconds).
  • the narrow pulse signal includes a time stamp of when it is transmitted.
  • the LPS receivers 68 - 72 receive the narrow pulse signal and determine their respective distances (e.g., d 1 , d 2 , and d 3 ) to the LPS transmitter 74 .
  • an LPS receiver 68 - 72 determines the distance to the LPS transmitter 74 based on the time stamp and the time at which the LPS receiver received the signal. Since the narrow pulse travels at the speed of light, the distance can be readily determined.
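  • A minimal sketch of this time-of-flight calculation, assuming synchronized clocks and time stamps expressed in nanoseconds (values illustrative):

      C = 299_792_458.0                    # speed of light in meters per second

      def tof_distance_m(t_transmit_ns, t_receive_ns):
          # Distance implied by the time-of-flight of the narrow pulse.
          return C * (t_receive_ns - t_transmit_ns) * 1e-9

      # a pulse time-stamped at 0 ns and received 10 ns later traveled about 3 meters
      print(tof_distance_m(0.0, 10.0))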
  • the plurality of distances between the LPS receivers 68 - 72 and the LPS transmitter 74 are then processed (e.g., by the game console device 12 or by a master LPS receiver) to determine the position of the LPS transmitter 74 within the local physical area in accordance with the known positioning of the LPS receivers 68 - 72 .
  • the LPS receiver (the game console device or a master LPS receiver) can determine the LPS transmitter's location to be somewhere on the surface of an imaginary sphere centered on the LPS receiver and whose radius is the distance to it.
  • the intersection of the four imaginary spheres reveals the location of the LPS transmitter 74 .
  • the processing of the LPS receiver to transmitter distances may be performed by a master LPS receiver, by the game console device 12 , by a motion tracking processing module, and/or by an LPS computer coupled to the plurality of LPS receivers.
  • the motion tracking processing module may be a single processing device or a plurality of processing devices.
  • Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
  • the processing module may have an associated memory and/or memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module.
  • a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64 .
  • the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may also be used to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
  • an LPS system may include both a plurality of LPS transmitters as in FIG. 6 and a plurality of LPS receivers as in FIG. 7 , where the LPS device on the person includes both the LPS receiver of FIG. 6 and the LPS transmitter of FIG. 7 .
  • the LPS transmitters of FIG. 6 and the LPS receivers of FIG. 7 may be stand-alone devices positioned throughout a localized physical area (e.g., a home, an office, a building, etc.) or may be included within a device that is positioned throughout the localized physical area.
  • access points of a WLAN may be included in smoke detectors, motion detectors of a security system, speakers of an intercom system, light fixtures, light bulbs, electronic equipment (e.g., computers, TVs, radios, clocks, etc.), and/or any device or object found or used in a localized physical area.
  • FIGS. 8-10 are diagrams of an embodiment of a three-dimensional Cartesian coordinate system of a localized physical area that may be used for a gaming system 10 .
  • an x-y-z origin is selected to be somewhere in the localized physical area and the position and motion of the player 16 and/or the gaming object 14 are determined with respect to the origin (e.g., 0, 0, 0).
  • a point (e.g., x 1 , y 1 , z 1 ) on the player 16 and a point (e.g., x 2 , y 2 , z 2 ) on the gaming object 14 are used to identify their respective positions in the gaming environment.
  • FIGS. 11-13 are diagrams of an embodiment of a spherical coordinate system of a localized physical area that may be used for a gaming system 10 .
  • an origin is selected to be somewhere in the localized physical area and the position and motion of the player 16 and/or the gaming object 14 are determined with respect to the origin.
  • the position of the player may be represented as a vector, or spherical coordinates, (ρ, φ, θ), where ρ ≥ 0 and is the distance from the origin to a given point P; 0 ≤ φ ≤ 180° and is the angle between the positive z-axis and the line formed between the origin and P; and 0 ≤ θ < 360° and is the angle between the positive x-axis and the line from the origin to P projected onto the xy-plane.
  • φ is referred to as the zenith, colatitude, or polar angle.
  • θ is referred to as the azimuth.
  • a point is plotted from its spherical coordinates by going ρ units from the origin along the positive z-axis, rotating φ about the y-axis in the direction of the positive x-axis, and rotating θ about the z-axis in the direction of the positive y-axis.
  • a point (e.g., ρ 1 , φ 1 , θ 1 ) on the player 16 and a point (e.g., ρ 2 , φ 2 , θ 2 ) on the gaming object 14 are used to identify their respective positions in the gaming environment.
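  • A minimal sketch of converting such spherical coordinates (ρ, φ, θ) to Cartesian coordinates under the convention above, with φ measured from the positive z-axis and θ from the positive x-axis (values illustrative):

      import math

      def spherical_to_cartesian(rho, phi_deg, theta_deg):
          # (rho, phi, theta) with phi from the +z axis and theta from the +x axis.
          phi = math.radians(phi_deg)
          theta = math.radians(theta_deg)
          x = rho * math.sin(phi) * math.cos(theta)
          y = rho * math.sin(phi) * math.sin(theta)
          z = rho * math.cos(phi)
          return x, y, z

      print(spherical_to_cartesian(2.0, 90.0, 0.0))   # -> (2.0, 0.0, ~0.0)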
  • FIGS. 8-13 illustrate two types of coordinate systems; however, any three-dimensional coordinate system may be used for tracking motion and/or establishing position within a gaming system.
  • FIG. 14 is a diagram of a method for determining position and/or motion tracking that begins at step 80 where the game console device determines the gaming environment 22 (e.g., determining the properties of the localized physical area such as height, width, depth, objects in the physical area, etc.). The method then continues at step 82 where the game console device maps the gaming environment to a coordinate system (e.g., Cartesian coordinate system of FIGS. 8-10 or spherical system of FIGS. 11-13 ). The method continues at step 84 where the game console device determines position of the player and/or the gaming object within the gaming environment in accordance with the coordinate system.
  • the game console device tracks the motion of the player and/or the gaming object.
  • the game console device separately determines the players' position and separately tracks their motion.
  • the game console device separately determines the gaming objects' position and separately tracks their motion.
  • the game console device separately determines the players' position, separately tracks their motion, separately determines the gaming objects' position and separately tracks the gaming objects' motion.
  • the game console device may determine the new position of the player and/or gaming object every 10 mS and use the old and new positions to determine the motion of the player and/or gaming object.
  • the method continues at step 88 where the game console device receives a gaming object response regarding a video game function from a gaming object.
  • the method continues at step 90 where the game console device integrates the gaming object response and the motion of the at least one of the player and the gaming object with the video game function. If the system includes multiple players and/or multiple gaming objects, the game console device 12 integrates their motion into the video game graphics being displayed. If the game console device receives multiple gaming object responses, the game console device integrates them into the video game graphics being displayed.
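  • The overall flow of FIG. 14 might be sketched as follows; the console object and its method names are hypothetical, and the 10 mS update interval is the example value given above:

      import time

      def game_loop(console, players, gaming_objects, interval_s=0.010):
          # Hedged sketch of FIG. 14; every console method here is an assumption, not defined by the text.
          environment = console.determine_gaming_environment()             # step 80
          console.map_environment_to_coordinate_system(environment)        # step 82
          entities = list(players) + list(gaming_objects)
          last_position = {e: console.determine_position(e) for e in entities}   # step 84
          while console.game_active():
              time.sleep(interval_s)                                        # e.g., once every 10 mS
              for entity in entities:
                  new_position = console.determine_position(entity)
                  motion = tuple(n - o for n, o in zip(new_position, last_position[entity]))
                  console.track_motion(entity, motion)                      # old vs. new position
                  last_position[entity] = new_position
              for response in console.poll_gaming_object_responses():      # step 88
                  console.integrate_with_video_game_function(response)     # step 90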
  • FIG. 15A is a diagram of another method for determining position and/or motion tracking that begins at step 100 where an origin of a Cartesian coordinate system (e.g., the coordinate system of FIGS. 8-10 ) is determined.
  • the origin may be any other point within the localized physical area of the gaming environment (e.g., a point on the game console device).
  • the method continues in one or more branches.
  • the initial coordinates of the player are determined using one or more of a plurality of position determining techniques as described herein.
  • This branch continues at step 108 by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
  • the other branch begins at step 102 where the coordinates of the gaming object's initial position are determined using one or more of a plurality of position determining techniques as described herein.
  • This branch continues at step 104 by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein.
  • the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 1 millisecond provides 0.1 mm accuracy in motion tracking.
  • FIG. 15B is a diagram of another method for determining position and/or motion tracking that begins at step 110 by determining a reference point within a coordinate system (e.g., the vector coordinate system of FIGS. 11-13 ).
  • the reference point may be the origin or any other point within the localized physical area.
  • the method continues in one or more branches.
  • a vector with respect to the reference point is determined to indicate the player's initial position, which may be done by using one or more of a plurality of position determining techniques as described herein.
  • This branch continues at step 118 by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
  • the other branch begins at step 112 by determining a vector with respect to the reference point for the gaming object to establish its initial position, which may be done by using one or more of a plurality of position determining techniques as described herein.
  • This branch continues at step 114 by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein.
  • the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 1 millisecond provides 0.1 mm accuracy in motion tracking.
  • FIGS. 16-18 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system 10 .
  • an xyz origin is selected to be somewhere in the localized physical area and the initial position of a point being tracked on the player and/or gaming object is determined based on its Cartesian coordinates with respect to the origin.
  • as a point moves from one position (e.g., x 0 , y 0 , z 0 ) to a new position (e.g., x 1 , y 1 , z 1 ), the change in its coordinates is used to track its motion.
  • the player and the gaming object may each have several points that are tracked and used to determine position and motion.
  • the positioning and motion tracking of the player (i.e., one or more points on the player) and/or gaming object (i.e., one or more points on the gaming object) may be done with respect to the origin or with respect to each other.
  • the gaming object's position and motion may be determined with reference to the origin and the position and motion of the player may be determined with reference to the position and motion of the gaming object.
  • the player's position and motion may be determined with reference to the origin and the position and motion of the gaming object may be determined with reference to the player's position and motion.
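  • A minimal sketch of deriving motion from successive Cartesian positions, both with respect to the origin and with respect to the gaming object (all coordinates illustrative):

      def motion(old_position, new_position):
          # Motion of a tracked point: the change in its Cartesian coordinates.
          return tuple(n - o for n, o in zip(new_position, old_position))

      def relative_to(point, reference):
          # Express one point's coordinates with respect to another point.
          return tuple(p - r for p, r in zip(point, reference))

      # gaming object tracked with respect to the origin (meters)
      obj_old, obj_new = (1.00, 0.50, 1.20), (1.02, 0.48, 1.25)
      obj_motion = motion(obj_old, obj_new)

      # player tracked with respect to the gaming object's position and motion
      player_old, player_new = (1.10, 0.60, 1.70), (1.15, 0.58, 1.78)
      player_motion_rel_obj = motion(relative_to(player_old, obj_old),
                                     relative_to(player_new, obj_new))
      print(obj_motion, player_motion_rel_obj)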
  • FIGS. 19-21 are diagrams of an embodiment of a spherical coordinate system of a localized physical area that may be used for a gaming system 10 .
  • an origin, or reference point is selected to be somewhere in the localized physical area and the initial position of a point being tracked on the player and/or gaming object is determined based on its spherical coordinates with respect to the origin.
  • the player and the gaming object may each have several points that are tracked and used to determine position and motion.
  • the positioning and motion tracking of the player (i.e., one or more points on the player) and/or gaming object (i.e., one or more points on the gaming object) may be done with respect to the origin of the spherical coordinate system or with respect to each other.
  • the gaming object's position and motion may be determined with reference to the origin and the position and motion of the player may be determined with reference to the position and motion of the gaming object.
  • the player's position and motion may be determined with reference to the origin and the position and motion of the gaming object may be determined with reference to the player's position and motion.
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking that begins at step 120 by determining environment parameters (e.g., the gaming environment) of the physical area in which the gaming object resides and/or in which the game system resides.
  • the environmental parameters include, but are not limited to, height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • the method continues at step 122 by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 8-13 ).
  • as an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room; inanimate objects in the room (e.g., a couch, a chair, etc.) may be included in the mapping of the environment parameters.
  • the method continues at step 124 by determining the coordinates of the player's, or players', position in the physical area.
  • the method continues at step 126 by determining the coordinates of a gaming object's initial position.
  • the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the initial position of the player may be used to determine the initial position of the gaming object.
  • one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method continues at step 128 by updating the coordinates of the player's, or players', position in the physical area to track the player's, or players', motion.
  • the method continues at step 130 by updating the coordinates of a gaming object's position to track its motion.
  • the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the motion of the player may be used to determine the motion of the gaming object.
  • one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method of FIG. 22 may be performed by the game console device that begins with determining at least one positioning coordinate for the player with respect to an origin of the coordinate system and determining at least one positioning coordinate for the gaming object with respect to the at least one positioning coordinate for the player.
  • the method continues with the game console device determining at least one next positioning coordinate for the player with respect to the origin and determining at least one next positioning coordinate for the gaming object with respect to the at least one next positioning coordinate for the player.
  • the method continues with the game console device determining the motion of the player, with respect to the origin, based on the at least one positioning coordinate for the player and the at least one next positioning coordinate for the player.
  • the method also includes the game console device determining the motion of the gaming object, with respect to the player, based on the at least one positioning coordinate for the gaming object and the at least one next positioning coordinate for the gaming object.
  • FIG. 23 is a diagram of another method for determining position and/or motion tracking that begins at step 140 by determining a reference point within the physical area in which the gaming object lays and/or in which the game system lays. The method then continues at step 142 by determining a vector for a player's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 11-13 ). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • the method continues at step 144 by determining a vector of a gaming object's initial position.
  • the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the initial position of the player may be used to determine the initial position of the gaming object.
  • one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method continues at step 146 by updating the vector of the player's, or players', position in the physical area to track the player's motion.
  • the method continues at step 148 by updating the vector of the gaming object's position to track its motion.
  • the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the motion of the player may be used to determine the motion of the gaming object.
  • one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • FIG. 24 is a diagram of another method for determining position and/or motion tracking that begins at step 150 by determining environment parameters of the physical area in which the gaming object lays and/or in which the game system lays.
  • the environmental parameters include, but are not limited to, height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • the method continues at step 152 by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 16-18 ).
  • as an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room; objects in the room (e.g., a couch, a chair, etc.) may be included in the mapping of the environment parameters.
  • the method continues at step 154 by determining the coordinates of the gaming object's initial position in the physical area.
  • the method continues at step 156 by determining the coordinates of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method continues at step 156 by updating the coordinates of the gaming object's position in the physical area to track its motion.
  • the method continues at step 158 by updating the coordinates of the player's position to track the player's motion with respect to the gaming object. Note that one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method of FIG. 24 may be performed by the game console device that begins with determining at least one positioning coordinate for the gaming object with respect to an origin of the coordinate system and determining at least one positioning coordinate for the player with respect to the at least one positioning coordinate for the gaming object. The method continues with the game console device determining at least one next positioning coordinate for the gaming object with respect to the origin and determining at least one next positioning coordinate for the player with respect to the at least one next positioning coordinate for the gaming object.
  • the method continues with the game console device determining the motion of the gaming object, with respect to the origin, based on the at least one positioning coordinate for the gaming object and the at least one next positioning coordinate for the gaming object.
  • the method continues with the game console device determining the motion of the player, with respect to the gaming object, based on the at least one positioning coordinate for the player and the at least one next positioning coordinate for the player.
  • FIG. 25 is a diagram of another method for determining position and/or motion tracking that begins at step 162 by determining a reference point within the physical area in which the gaming object lays and/or in which the game system lays. The method continues at step 164 by determining a vector for a gaming object's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 19-21 ). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • The method continues at step 166 by determining a vector of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method continues at step 168 by updating the vector of the gaming object's position in the physical area to track its motion.
  • the method continues at step 170 by updating the vector of the player's position with respect to the gaming object's motion to track the player's motion. Note that one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • FIG. 26 is a diagram of another embodiment of a coordinate system of a gaming system that is an extension of the coordinate systems discussed above.
  • the coordinate system includes a positioning coordinate grid 172 and a motion tracking grid 174 , where the motion tracking grid 174 has a finer resolution than the positioning coordinate grid 172 .
  • the player and/or gaming object may be positioned anywhere within the gaming environment at a given time, but, for a given time interval (e.g., 1 second), the player's and/or gaming object's position will be relatively fixed. However, within this relative stationary position, the player and/or gaming object may move (e.g., a head bob, slash of the gaming object, turn sideways, etc.) during the given time interval.
  • the low resolution (e.g., within a meter) of the positioning coordinate grid 172 can be adequately used to establish the player's and/or gaming object's relatively stationary positions for the given time interval.
  • the finer resolution (e.g., within a few millimeters) of the motion tracking grid 174 is used at a higher interval rate (e.g., once every 10 mS) to accurately track the motion of the player and/or game object. Note that, once the relatively stationary position of the player and/or gaming object for the given time period is established, the motion tracking can be focused to the immediate area of the relatively stationary position.
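  • A minimal sketch of maintaining the coarse positioning grid alongside the finer motion tracking grid, using the example resolutions and intervals given above (class and method names are hypothetical):

      from dataclasses import dataclass

      @dataclass
      class DualResolutionTracker:
          position_resolution: float = 1.0     # coarse positioning grid, roughly 1 meter cells
          motion_resolution: float = 0.005     # fine motion tracking grid, a few millimeter cells
          position_interval: float = 1.0       # re-establish the coarse position every 1 second
          motion_interval: float = 0.010       # track fine motion once every 10 mS

          def quantize(self, point, resolution):
              # Snap a coordinate to the nearest grid cell at the given resolution.
              return tuple(round(c / resolution) * resolution for c in point)

          def coarse_position(self, point):
              # Relatively stationary position over the positioning interval.
              return self.quantize(point, self.position_resolution)

          def fine_motion(self, previous_point, current_point):
              # Motion within the immediate area of the coarse position.
              prev = self.quantize(previous_point, self.motion_resolution)
              curr = self.quantize(current_point, self.motion_resolution)
              return tuple(c - p for c, p in zip(curr, prev))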
  • FIG. 27 is a schematic block diagram of an embodiment of a wireless communication system that includes a plurality of access points 180 - 184 , a gaming console device 12 , a gaming object 14 , a device 186 , and a local positioning system (LPS) receiver 66 .
  • the LPS receiver 66 is associated with the gaming object 14 and/or with the player 16 .
  • the game console device 12 is coupled to the plurality of access points (AP) 180 - 184 and to at least one wide area network (WAN) connection (e.g., digital subscriber loop (DSL) connection, cable modem, satellite connection, etc.). In this manner, the game console device 12 may function as the bridge, or hub, for the WLAN to the outside world.
  • the access points 180 - 184 are positioned throughout a given area to provide a seamless WLAN for the given area (e.g., a house, an apartment building, an office building, etc.).
  • the device 186 may be any wireless communication device that includes circuitry to communicate with a WLAN.
  • the device may be a cell phone, a computer, a laptop, a PDA, a cordless phone, etc.
  • each access point 180 - 184 includes an accurate clock (e.g., an atomic clock) or is coupled to an accurate clock source to provide an accurate time standard for synchronization at any point in the physical area.
  • Each AP transmits a spread spectrum signal (s 1 ) containing a BPSK (bi-phase shift keyed) signal in which 1's and 0's are represented by reversal of the phase of the carrier, or a signal having some other format (e.g., FM, AM, QAM, QPSK, ASK, FSK, MSK).
  • This message is transmitted at a specific frequency at a “chipping rate” of x bits per second (e.g., 50 bits per second).
  • the signal may repeat every 10-30 milliseconds (or longer duration) and it contains information regarding the entire LPS and information regarding the AP transmitting the signal.
  • the signal may be a very narrow pulse (e.g., less than 1 nanosecond), repeated at a desired rate (e.g., 1-100 KHz).
  • the LPS receiver 66 utilizes the signals to determine its position within a given coordinate system (See FIGS. 8-14 , 16 - 21 ). For instance, the LPS receiver 66 determines a time delay (e.g., t 1 , t 2 , and t 3 ) for at least some of the plurality of signals in accordance with the at least one clock signal. The LPS receiver 66 calculates a distance to a corresponding one of the plurality of APs based on the time delays of the signals (s 1 ).
  • the LPS receiver 66 calculates a time delay of the signal (s 1 ) received from the APs, or a subset thereof (e.g., at a minimum three and preferably four), to triangulate its position in three-dimensional space. For instance, the LPS receiver 66 identifies each AP signal by its distinct code pattern, and then measures the time delay for each AP. To do this, the receiver 66 produces an identical signal sequence using the same seed number as the AP. By lining up the two sequences, the receiver 66 can measure the delay and calculate the distance to the AP.
  • the LPS receiver 66 determines the position of the corresponding plurality of APs based on the signals. For example, the LPS receiver 66 uses the position data of the signals to determine the APs' position. The LPS receiver 66 then determines its location based on the distance to the APs and the position of the APs. For instance, by knowing the position and the distance of an AP, the LPS receiver 66 can determine its location to be somewhere on the surface of an imaginary sphere centered on that AP and whose radius is the distance to it. When four APs are measured simultaneously, the intersection of the four imaginary spheres reveals the location of the receiver. Often, these spheres will overlap slightly instead of meeting at one point, so the receiver will yield a mathematically most-probable position (and often indicate the uncertainty).
  • the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may be used to determine the relative position and to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
  • FIG. 28 is a schematic block diagram of another embodiment of a wireless communication system that includes a plurality of access points 180 - 184 , a gaming console device 12 , a gaming object 14 , and the device 186 .
  • the gaming object 14 and/or the player 16 may have associated therewith a local positioning system (LPS) transmitter 74 .
  • the game console device 12 is coupled to the plurality of access points (AP) 180 - 184 , which are positioned throughout a given area to provide a seamless WLAN for the given area (e.g., a house, an apartment building, an office building, etc.).
  • the game console device 12 is coupled to at least one wide area network (WAN) connection (e.g., DSL connection, cable modem, satellite connection, etc.). In this manner, the game console device may function as the bridge, or hub, for the WLAN to the outside world.
  • the LPS transmitter 74 includes an accurate clock and transmits a narrow pulse (e.g., pulse width less than 1 nanosecond) at a desired rate (e.g., once every millisecond to once every few seconds).
  • the narrow pulse signal includes a time stamp of when it is transmitted.
  • the APs 180 - 184 receive the narrow pulse signal and determine their respective distances to the LPS transmitter 74 .
  • an AP determines the distance to the LPS transmitter 74 based on the time stamp and the time at which the AP received the signal. Since the narrow pulse travels at the speed of light, the distance can be readily determined.
  • the plurality of distances between the APs 180 - 184 and the LPS transmitter 74 are then processed to determine the position of the LPS transmitter 74 within the local physical area in accordance with the known positioning of the APs. For instance, with the known position and the distance of an AP to the LPS transmitter 74 , an AP can determine the LPS transmitter's location to be somewhere on the surface of an imaginary sphere centered on that AP and whose radius is the distance to it. When the distance to four APs is known, the intersection of the four imaginary spheres reveals the location of the LPS transmitter.
  • the processing of the AP to transmitter 74 distances may be performed by a master AP, by the game console device 12 , by a motion tracking processing module, and/or by an LPS computer coupled to the plurality of APs 180 - 184 .
  • the motion tracking processing module may be a single processing device or a plurality of processing devices.
  • Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
  • the processing module may have an associated memory and/or memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module.
  • a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64 .
  • the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may be used to determine the relative position and to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
  • FIG. 29 is a schematic block diagram of another embodiment of a wireless communication system that includes a plurality of LAN devices 192 - 196 , a WAN coupling device 190 , a game console device 12 , a gaming object 14 , and a player 16 .
  • Each of the LAN devices 192 - 196 , which may be a wired device (e.g., includes an Ethernet network card, a FireWire interface, etc.) or a wireless device, includes an LPS module 198 - 202 , and the gaming object 14 and/or the player 16 has associated therewith an LPS personal module 205 .
  • the LPS modules 198 - 202 include an LPS transmitter 60 - 64 and the LPS personal module 205 includes an LPS receiver 66 as described with reference to FIG. 6 .
  • the LPS modules 198 - 202 include an LPS receiver 68 - 72 and the LPS personal module 205 includes an LPS transmitter 74 .
  • the WAN coupling device 190 may be a cable modem, a DSL modem, a satellite receiver, a cable receiver, and/or any other device that provides a WAN connection 206 to a WAN network (e.g., the internet, a public phone system, a private network, etc.).
  • FIGS. 30 and 31 are top and side view diagrams of an embodiment of determining position and/or motion tracking using RF and/or microwave signaling.
  • a transceiver 32 (which may be included in the game console device, coupled to a game console, coupled to a remote game console, or coupled to a server via a WAN connection) transmits a plurality of beamformed signals at one or more frequencies (e.g., frequencies in the ISM band, 29 MHz, 60 MHz, above 60 GHz, and/or other millimeter wavelengths (MMW)) to sweep the physical area.
  • the transceiver 32 determines the reflected signal 212 energy and may also determine the refracted signal 216 energy.
  • the transceiver 32 may also determine the pass through signal 214 component. Since different objects reflect, refract, and/or pass through RF to MMW signals in different ways, the game console device 12 can identify an object based on the reflected, refracted, and/or pass through signal energies. For example, human beings reflect, refract, and/or pass through RF and MMW signals in a different way than inanimate objects such as furniture, walls, plastics, metals, clothing, etc.
  • a three-dimensional image of the physical area is obtained. Further analysis of the reflected, pass through, and/or refracted signals yields the distance to the transceiver 32 . From the distances for a plurality of beamformed signals, the position of the objects (including the player and the gaming object) may be determined. Note that more than one transceiver may be used to determine the three-dimensional image of the physical area and/or to determine positioning and/or motion tracking within the physical area.
  • Beamforming is discussed in a patent application entitled, “BEAMFORMING AND/OR MIMO RF FRONT-END AND APPLICATIONS THEREOF,” having a Ser. No. of 11/527,961, and a filing date of Sep. 27, 2006, which is incorporated herein by reference.
  • the transceiver 32 using MMW signaling can track the motion of the player and/or gaming object.
  • the wavelength of a 60 GHz signal is approximately 5 millimeters.
  • a ninety degree phase shift of the signal corresponds to a 1.25 millimeter movement.
  • at a motion tracking rate (e.g., once every 10-30 mS), such phase shifts allow millimeter-scale movements of the player and/or gaming object to be tracked.
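  • The wavelength arithmetic above can be checked with a short sketch (λ = c/f, and a 90 degree phase shift corresponds to a quarter wavelength; values illustrative):

      C = 299_792_458.0                            # speed of light, meters per second
      f = 60e9                                     # 60 GHz carrier
      wavelength = C / f                           # approximately 5 millimeters

      def displacement_from_phase(delta_phase_deg, freq_hz=f):
          # Movement along the beam implied by a measured carrier phase shift.
          return (C / freq_hz) * (delta_phase_deg / 360.0)

      print(round(wavelength * 1e3, 2))                        # ~5.0 mm
      print(round(displacement_from_phase(90.0) * 1e3, 2))     # ~1.25 mm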
  • FIG. 32 is a schematic block diagram of an embodiment of a transceiver 32 that includes a processing module 220 , one or more image intensity sensors 222 , and an RF transmitter 224 .
  • the RF transmitter 224 includes an oscillator 228 , a plurality of power amplifiers (PA) 230 - 232 , and a beamforming module 226 coupled to a plurality of antenna structures.
  • the plurality of antenna structures may be configurable antenna structures as discussed in patent application entitled, “INTEGRATED CIRCUIT ANTENNA STRUCTURE”, having a Ser. No. of 11/648,826, and a filing date of Dec. 29, 2006, patent application entitled, “MULTIPLE BAND ANTENNA STRUCTURE, having a Ser. No.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64 .
  • a beamformed signal 212 can be directed in any two or three dimensional direction within the physical area.
  • the desired frequency of the oscillation may be adjusted to provide a frequency spectrum sweep of the physical area.
  • the one or more image intensity sensors 222 measure the temperature of the objects, which is a function of the reflectivity, emissivity, and transmissivity of the surface of the physical area.
  • Emissivity is the ratio of the radiation intensity of a nonblack body to the radiation intensity of a blackbody. This ratio, which is usually designated by the Greek letter ε, is always less than or equal to one.
  • the emissivity characterizes the radiation or absorption quality of nonblack bodies. Published values are available for most substances. Emissivities vary with temperature and also vary throughout the spectrum. Transmissivity is the ratio of the transmitted radiation to the radiation arriving perpendicular to the boundary between two mediums.
  • the one or more image sensors provide the temperature of the object(s) to the processing module 220 .
  • the processing module 220 accumulates temperatures of the object(s) for various beamformed signals 212 and/or for various frequencies and processes the temperatures in accordance with an image intensity processing algorithm to provide a three dimensional image of the physical area and the objects in it.
  • the image intensity processing algorithm may further include a positioning and/or motion tracking sub routine to establish the positioning and/or motion tracking of a player and/or gaming object within the physical area.
  • the gaming object may be made of one or more materials that make it readily distinguishable from other objects that may be found in the physical area. For example, it may be made of a combination of metals and plastics in a particular shape.
  • FIG. 33 is a diagram of another method for determining position and/or motion tracking that begins at step 240 by transmitting one of a plurality of beamformed signals.
  • the method continues at step 242 by receiving one or more image intensity signals (e.g., reflectivity, emissivity, and transmissivity of the surface of the physical area) for the given beamformed signal.
  • the method then branches to step 246 and step 248 .
  • the likely material of the object(s) is determined based on the received one or more image intensity signals.
  • the distance to the object(s) is determined based on the received one or more image intensity signals.
  • the method continues at step 248 by determining whether all of the beamformed signals have been processed (e.g., different angles and/or at different frequencies). If not, the process repeats by transmitting one of the beamformed signals.
  • FIG. 34 is a diagram of another method for determining position and/or motion tracking that may begin at step 260 with the optional step of adjusting frequency (e.g., in the MMW band) of the beamforming signals for optimal human imaging.
  • the method continues at step 262 by updating the beamforming coefficients based on the player's and/or gaming object's position. With this step, or these steps, the transceiver is focused on tracking the motion of the player and/or gaming object.
  • the method continues at step 264 by transmitting one of the beamforming signals and at step 266 by receiving one or more image intensity signals in response to the focused beamformed signal.
  • the method then continues at step 268 by determining a distance to the object based on the received one or more image intensity signals. If all of the beamforming signals have not been transmitted as determined at step 270 , the method repeats at step 264 by transmitting the next beamforming signal.
  • the method continues at step 272 by compiling distances to establish the player's and/or gaming object's motion.
  • the method continues at step 274 by determining whether it is time to update the position of the player and/or gaming object.
  • the motion tracking processing may be repeated every 10-100 mSec and the positioning may be updated once every 1-10 seconds.
  • the positioning may be updated to keep the player and/or gaming object within a desired processing region. For example, with reference to FIG. 26 , the motion tracking grid is moved based on the updated positioning such that the focusing of the beamforming signals is concentrated on the motion tracking grid.
  • the method repeats. If it is time to update the positioning, the method of FIG. 33 may be used.
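  • A hedged sketch of this focused motion-tracking loop; the radar and target objects and their methods are assumptions, and the update periods are the example values given above:

      def track_focused_motion(radar, target, motion_period_s=0.010, position_period_s=1.0):
          # Sketch of the FIG. 34 flow: fine motion tracking with periodic position updates.
          elapsed = 0.0
          radar.adjust_frequency_for_human_imaging()                     # optional step 260
          while radar.tracking_enabled():
              radar.update_beamforming_coefficients(target.position)     # step 262: focus on target
              distances = []
              for beam in radar.focused_beams():                         # steps 264-270
                  intensity = radar.transmit_and_receive(beam)
                  distances.append(radar.distance_from_intensity(intensity))   # step 268
              target.motion = radar.compile_motion(distances)            # step 272
              elapsed += motion_period_s
              if elapsed >= position_period_s:                           # step 274
                  target.position = radar.coarse_position()              # e.g., re-run FIG. 33
                  elapsed = 0.0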
  • FIG. 35 is a schematic block diagram of an embodiment of a wireless communication between a gaming object 14 and a game console device 12 .
  • the gaming object 14 and the game console device 12 each includes a plurality of antenna structures.
  • the antenna radiation pattern for the plurality of structures may be as shown in FIG. 36 .
  • the antenna arrays will receive their respective signals with differing signal characteristics (signal strength, phase, beam angle, constructive and destructive interference of the signals, etc.), based on the orientation of the gaming object 14 with respect to the game console device 12 .
  • the movement of the gaming object 14 may be determined.
  • the game console device 12 may transmit the signals and the gaming object 14 determines the signal characteristics.
  • FIG. 37 is a diagram of another embodiment of an antenna radiation pattern for first and second antennas for first and second frequencies. The diagram further illustrates a source position of the transmitted signals.
  • the f 2 antennas are orthogonal to each other and at a 45 degree relationship with the f 1 antennas, which are orthogonal to each other.
  • FIG. 38 is a diagram of an example of receiving the RF and/or MMW signals by the various antennas of the antenna arrays.
  • f 1 antennas receive the transmitted RF and/or MMW signal [e.g., A1 cos(ωf1(t))] with different characteristics.
  • the received signals of the f 1 antennas are combined to produce a first resulting signal [e.g., A′1 cos(ωf1(t) + θ1 + φ1)], where A′1 is the received amplitude, θ is the beam angle, and φ is the phase rotation.
  • f 2 antennas receive the transmitted RF and/or MMW signal [e.g., A2 cos(ωf2(t))] with different characteristics.
  • the received signals of the f 2 antennas are combined to produce a second resulting signal [e.g., A′2 cos(ωf2(t) + θ2 + φ2)].
  • the resulting signals can be processed to determine the beam angle, phase angle, and amplitude of the transmitted signals. From this information, the position and/or motion tracking may be determined.
  • FIGS. 40 and 41 are diagrams of an example of frequency dependent distance calculation where the phase difference at different times for different signals is determined.
  • the positioning and/or motion tracking of an object may be done based on the phase difference, the transmission distance, and the frequency of the signals from time to time.
  • the transmitter transmits a signal as shown in FIG. 40 .
  • the first antenna receives the signal.
  • the phase rotation (e.g., φ0-1) of the received signal is determined.
  • the second antenna receives the signal.
  • the phase rotation (e.g., φ0-2) of the received signal is determined.
  • With a known distance between the first and second antennas, the different phase rotations, and the carrier frequency of the signal, the distance between the transmitter and receiver can be determined. Using the beam angle, the orientation of the distance can be determined.
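  • One hedged sketch of such a phase-based calculation: the phase rotations measured at two antennas a known distance apart give a path-length difference and, in the far field, the beam (arrival) angle; several such measurements from different sources or frequencies can then be combined for distance and position (symbols and values illustrative):

      import math

      C = 299_792_458.0    # speed of light, meters per second

      def path_difference_m(phi1_rad, phi2_rad, freq_hz):
          # Extra path length to the second antenna implied by the two phase rotations.
          wavelength = C / freq_hz
          return ((phi2_rad - phi1_rad) / (2.0 * math.pi)) * wavelength

      def beam_angle_deg(phi1_rad, phi2_rad, freq_hz, antenna_spacing_m):
          # Far-field arrival angle from the phase difference across the two antennas.
          ratio = path_difference_m(phi1_rad, phi2_rad, freq_hz) / antenna_spacing_m
          return math.degrees(math.asin(max(-1.0, min(1.0, ratio))))

      # e.g., a 60 GHz carrier, antennas 10 mm apart, measured phase rotations in radians
      print(beam_angle_deg(0.50, 2.00, 60e9, 0.010))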
  • FIGS. 42 and 43 are diagrams of an example of constructive and destructive signaling to facilitate the determination of positioning and/or motion tracking.
  • At least two antennas physically separated by a known distance transmit different sinusoidal signals [e.g., cos(ωf1(t)) and cos(ωf2(t))].
  • An antenna assembly of the gaming object and/or player receives the signals and, based on the constructive and destructive patterns, the distance may be determined. Obtaining multiple distances from multiple sources and knowing the source locations, the position and/or motion of the object can be determined. Such a process may be augmented by using the attenuation properties of a signal in air and/or by using multiple different frequency signals.
  • FIG. 44 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12 , a gaming object 14 , and a plurality of digital image sensors 290 - 294 (e.g., digital cameras, digital camcorders, digital image sensor, etc.).
  • the gaming system 10 has an associated physical area in which the gaming object 14 and player 16 are located.
  • the physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • the plurality of digital imaging sensors 290 - 294 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures an image of the player 16 and/or gaming object 14 within the physical area based on the position of the player and/or gaming object.
  • the digital imaging sensors 290 - 294 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • the captured images are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the digital image sensors may be positioned and/or adjusted to focus on the player's and/or gaming object's movement. The images captured by the digital image sensors are then processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player 16 and/or gaming object 14 may include sensors (e.g., blue screen patches, etc.) thereon to facilitate the position and/or motion tracking processing.
  • FIG. 45 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12 , a gaming object 14 , and a plurality of heat sensors 300 - 304 (e.g., infrared thermal imaging cameras, infrared radiation thermometer, thermal imager, ratio thermometers, Optical Pyrometer, fiber optic temperature sensor, etc.).
  • the gaming system has an associated physical area in which the gaming object and player are located.
  • the physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • the plurality of heat sensors 300 - 304 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures a heat image of the player 16 and/or gaming object 14 within the physical area based on the position of the player and/or gaming object.
  • the heat sensors 300 - 304 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • the captured heat images are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the heat sensors may be positioned and/or adjusted to focus on the player and/or gaming object movement. The heat images captured by the heat sensors are then processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include sensors thereon to facilitate the position and/or motion tracking processing.
  • the plurality of electromagnetic sensors 310 - 314 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures an electromagnetic image of the player and/or gaming object within the physical area based on the position of the player and/or gaming object.
  • the electromagnetic sensors 310 - 314 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • the captured electromagnetic images are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the electromagnetic sensors may be positioned and/or adjusted to focus on the player and/or gaming object movement. The electromagnetic images captured by the electromagnetic sensors are then processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include sensors thereon to facilitate the position and/or motion tracking processing.
  • FIG. 47 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12 , a gaming object 14 , and a plurality of laser sensors 320 - 324 (e.g., Laser Distance Measurement Photoelectric Sensors, digital laser sensor, short range laser sensor, medium range laser sensor, etc.).
  • the gaming system has an associated physical area in which the gaming object 14 and player 16 are located.
  • the physical area may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • the plurality of laser sensors 320 - 324 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures laser based relative distances of the player and/or gaming object within the physical area based on the position of the player and/or gaming object.
  • the laser sensors 320 - 324 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • the relative distances are initially used to determine the position of the gaming object 14 and/or the player 16 . Once the player's and/or gaming object's position is determined, the laser sensors may be positioned and/or adjusted to focus on the player and/or gaming object movement. Subsequent relative distances are processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include sensors thereon to facilitate the position and/or motion tracking processing.
  • FIG. 48 is a diagram of another method for determining position and/or motion tracking that begins at steps 330 and 332 by determining the relative position of the player and/or gaming object using two or more positioning techniques (e.g., RF beamforming, laser sensors, etc.). The method continues at step 334 by combining the two or more positions to produce the initial position.
  • the two or more positioning techniques may be weighted based on a variety of factors including, but not limited to, accuracy, distance, interference, availability, etc.
  • one technique may be used to capture the position in one plane (e.g., x-y plane), a second technique may be used to capture the position in a second plane (e.g., x-z plane), and/or a third technique may be used to capture the position in a third plane (e.g., y-z plane).
  • the method continues at step 342 by determining whether the position needs to be updated (e.g., to change the focus of the motion tracking processing). If yes, the method repeats at steps 330 and 332. If not, the method repeats at steps 336 and 338.
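Step 334's combination of position estimates can be as simple as a weighted average, with weights drawn from the factors listed above (accuracy, interference, availability, and so on). The sketch below is a hypothetical illustration under that assumption; the function name and the weights are invented for the example.

```python
def combine_positions(estimates):
    """Weighted average of (x, y, z) position estimates from different techniques.

    `estimates` is a list of ((x, y, z), weight) pairs; a weight might reflect a
    technique's accuracy, distance to the target, interference, or availability.
    """
    total = sum(weight for _, weight in estimates)
    if total == 0:
        raise ValueError("at least one estimate must carry a nonzero weight")
    return tuple(
        sum(pos[i] * weight for pos, weight in estimates) / total
        for i in range(3)
    )

# Hypothetical case: the RF beamforming fix is trusted more than the laser fix.
initial_position = combine_positions([((1.0, 2.0, 0.5), 0.7), ((1.2, 1.9, 0.6), 0.3)])
print(initial_position)  # roughly (1.06, 1.97, 0.53)
```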
  • FIG. 49 is a diagram of another method for determining position and/or motion tracking that begins at step 350 by evaluating the physical environment in which the player and/or gaming object are located.
  • the game console may also be located in the physical environment, which may be a room, a portion of a room, an office, and/or any area in which a player can play a video game.
  • the method continues at step 352 by selecting one or more of a plurality of positioning techniques for determining the position of the player and/or gaming object based on the physical environment.
  • the method continues at step 354 by determining the position of the player and/or gaming object using the one or more positioning techniques.
  • the method continues at step 356 by selecting one or more of motion tracking techniques to determine the motion of the player and/or gaming object based on the environment and/or the position of the player and/or gaming object.
  • the method continues at step 358 by determining the motion of the player and/or gaming object using the selected motion tracking technique(s).
  • the method continues at step 360 by determining whether the position of the player and/or gaming object needs to be updated and repeats as shown.
  • FIG. 50 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12 , a gaming object 14 , an RFID reader 370 , at least one RFID tag 372 associated with the player 16 , and at least one RFID tag 372 associated with the gaming object 14 .
  • the gaming system has an associated physical area in which the gaming object and player are located.
  • the physical area may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • the RFID reader 370 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) communicates with the RFID tags 372 to determine distances of the player 16 and/or gaming object 14 within the physical area. This may be done by using the RFID system (e.g., the reader and the tags) as an RF radar system.
  • the RFID system may use a backscatter technique to determine distances between the RFID reader and the RFID tags.
  • the RFID system may use frequency modulation to compare the frequency of two or more signals, which is generally more accurate than timing the signal. By changing the frequency of the returned signal and comparing that with the original, the difference can be easily measured.
  • the RFID system may use a continuous wave radar technique.
  • a “carrier” radar signal is frequency modulated in a predictable way, typically varying up and down with a sine wave or sawtooth pattern at audio frequencies or other desired frequency.
  • the signal is then sent out from one antenna and received on another and the signal can be continuously compared. Since the signal frequency is changing, by the time the signal returns to the source the broadcast has shifted to some other frequency. The amount of that shift is greater over longer times, so greater frequency differences mean a longer distance. The amount of shift is therefore directly related to the distance traveled, and can be readily determined.
  • This signal processing is similar to that used in speed detecting Doppler radar.
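For the continuous wave (FMCW-style) technique described above, the measured frequency difference maps directly to distance once the sweep slope is known. The following sketch assumes a simple sawtooth sweep and ignores any Doppler contribution from a moving tag; the numbers are illustrative only.

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_distance(delta_f_hz, sweep_bandwidth_hz, sweep_time_s):
    """Range from the frequency difference of a sawtooth frequency-modulated sweep.

    delta_f_hz: measured difference between the transmitted and returned frequency.
    The sweep slope is bandwidth / sweep time; the round-trip delay is delta_f / slope,
    and the one-way distance is half the round-trip delay times the speed of light.
    """
    slope = sweep_bandwidth_hz / sweep_time_s
    round_trip_delay = delta_f_hz / slope
    return C * round_trip_delay / 2.0

# Hypothetical numbers: a 100 MHz sweep over 1 ms and a 2 kHz shift -> about 3 m.
print(fmcw_distance(2_000.0, 100e6, 1e-3))
```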
  • the distances are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the RFID system may be adjusted to focus on the player and/or gaming object movement. Subsequently determined distances are processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or of the player.
  • FIG. 51 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12 , a gaming object 14 , a plurality of RFID readers 370 , at least one RFID tag 372 associated with the player 16 , and at least one RFID tag 372 associated with the gaming object 14 .
  • the gaming system 10 has an associated physical area in which the gaming object and player are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • one or more of the RFID readers 370 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) communicates with one or more of the RFID tags 372 to determine the distances of the player 16 and/or gaming object 14 within the physical area.
  • This may be done by using the RFID system (e.g., the readers and the tags) as an RF radar system.
  • the RFID system may use a backscatter technique to determine distances between the RFID reader and the RFID tags.
  • the RFID system may use frequency modulation to compare the frequency of two or more signals, which is generally more accurate than timing the signal. By changing the frequency of the returned signal and comparing that with the original, the difference can be easily measured.
  • the RFID system may use a continuous wave radar technique.
  • a “carrier” radar signal is frequency modulated in a predictable way, typically varying up and down with a sine wave or sawtooth pattern at audio frequencies or other desired frequency.
  • the signal is then sent out from one antenna and received on another and the signal can be continuously compared. Since the signal frequency is changing, by the time the signal returns to the source the broadcast has shifted to some other frequency. The amount of that shift is greater over longer times, so greater frequency differences mean a longer distance. The amount of shift is therefore directly related to the distance traveled, and can be readily determined.
  • This signal processing is similar to that used in speed detecting Doppler radar.
  • the distances are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the RFID system may be adjusted to focus on the player and/or gaming object movement. Subsequently determined distances are processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or of the player.
  • FIG. 52 is a schematic block diagram of a side view of another embodiment of a gaming system 10 that includes a game console device 12 , a gaming object 14 , one or more RFID readers 370 , a plurality of RFID tags 372 associated with the player 16 , and a plurality of RFID tags 372 associated with the gaming object 14 .
  • the gaming system has an associated physical area in which the gaming object and player are located.
  • the player 16 and the gaming object 14 are within the determined relative position 378 .
  • the one or more RFID readers 370 transmits an RFID reader transmission 374 , which may be in accordance with an RF radar transmission as discussed above.
  • the RFID reader transmission 374 may be a request for at least one of the RFID tags 372 to provide a response with information used to determine its position or distance with reference to a particular point.
  • the RFID tags provide an RFID tag response 376 , which may be in accordance with the RF radar transmissions discussed above. Alternatively, each RFID tag may provide a response with information used to determine its position or its distance to a reference point.
  • the communication between the RFID reader(s) and RFID tags may be done in a variety of ways, including, but not limited to, a broadcast transmission and a collision detection and avoidance response scheme, in a round robin manner, in an ad hoc manner based on a desired updating rate for a given RFID tag (e.g., a slow moving tag needs to be updated less often than a fast moving tag), etc.
  • FIG. 53 is a schematic block diagram of an embodiment of an RFID reader 370 in the game console device 12 and an RFID tag 372 in the gaming object 14 .
  • the RFID reader 370 includes a protocol processing module 380 , an encoding module 382 , a digital to analog converter 384 , an RF front-end 386 , a digitization module 388 , a pre-decoding module 390 , and a decoding module 392 .
  • the RFID tag 372 includes a power generating circuit 394 , an envelope detection module 396 , an oscillation module 398 , an oscillation calibration module 400 , a comparator 402 , and a processing module 404 .
  • The details of the RFID reader 370 are disclosed in patent application entitled RFID READER ARCHITECTURE, having a Ser. No. of 11/377,812, and a filing date of Mar. 16, 2006, and the details of the RFID tag 372 are disclosed in patent application entitled POWER GENERATING CIRCUIT, having a Ser. No. of 11/394,808, and a filing date of Mar. 31, 2006. Both patent applications are incorporated herein by reference.
  • FIG. 54 is a diagram of a method for determining position of a player and/or gaming object that begins at step 410 with an RFID reader transmitting a power up signal to one or more RFID tags, which may be active or passive tags.
  • the power up signal may be a tone signal such that a passive RFID tag can generate power therefrom.
  • the power up signal may be a wake-up signal for an active RFID tag.
  • the method continues at step 412 with the RFID tag providing an acknowledgement that it is powered up. Note that this step may be skipped.
  • the method continues at step 414 with the RFID reader transmitting a command at time t0, where the command requests a response to be sent at a specific time after receipt of the command.
  • an RFID tag provides the response and, at step 416 , the reader receives it.
  • the method continues at step 418 with the RFID reader recording the time and the tag ID.
  • the method continues at step 420 with the reader determining the distance to the RFID tag based on the stored time, time t0, and the specific time delay.
  • the method continues at step 422 by determining whether all or a desired number of tags have provided a response. If not, the method loops as shown. If yes, the method continues at step 424 by determining the general position of the player and gaming object based on the distances. As an alternative, the general position of each of the tags may be determined from their respective distances at step 426 . Note that at least three, and preferably four, distances need to be accumulated from different sources (e.g., multiple RFID readers or an RFID reader with multiple physically separated transmitters) to triangulate the RFID tag's position.
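A rough sketch of steps 420 through 426 follows: the first function recovers a distance from the round-trip timing (given the tag's fixed response delay), and the second estimates a position from several such distances by linearizing the range equations. The source coordinates and measured distances are hypothetical, and, as noted above, a fourth distance helps resolve the ambiguity left by only three.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tag_distance(t_command, t_response, fixed_delay):
    """Distance from round-trip timing (step 420): the command leaves at t_command,
    the tag answers exactly fixed_delay after receiving it, and the answer arrives
    at t_response; what remains is the two-way propagation time."""
    propagation = (t_response - t_command - fixed_delay) / 2.0
    return C * propagation

def position_from_distances(anchors, distances):
    """Least-squares position (steps 424/426) from known source positions and
    measured distances; subtracting the first range equation from the rest
    turns the problem into a linear system."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Four hypothetical reader/transmitter locations (meters) and distances to one tag.
sources = [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (0.0, 3.0, 0.0), (0.0, 0.0, 2.5)]
ranges = [1.732, 3.317, 2.449, 2.062]
print(position_from_distances(sources, ranges))  # roughly [1.0, 1.0, 1.0]
```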
  • FIG. 55 is a schematic block diagram of an embodiment of a gaming object 14 that includes an integrated circuit (IC) 434 , a gaming object transceiver 432 and a processing module 430 .
  • the IC 434 includes one or more of an RFID tag 446 , a servo motor 448 , a received signal strength indicator 444 , a pressure sensor 436 , an accelerometer 438 , a gyrator 440 , an LPS receiver 442 , and an LPS transmitter 445 .
  • the gaming object 14 may not include the processing module 430 and/or the gaming object transceiver 432 .
  • the RFID tag is coupled to one or more antenna assemblies and the gaming object transceiver is also coupled to one or more antenna assemblies.
  • the RFID tag may communicate with an RFID reader using one or more carrier frequencies to facilitate positioning and/or tracking as described above.
  • the RFID tag may provide the communication path for data generated by the RSSI module, the servo motor, the pressure sensor, the accelerometer, the gyrator, the LPS receiver, and/or the LPS transmitter. Details of including a gyrator or pressure sensor on an IC are provided in patent application entitled GAME DEVICES WITH INTEGRATED GYRATORS AND METHODS FOR USE THEREWITH, having a Ser. No. of 11/731,318, and a filing date of Mar.
  • the RFID tag may use a different frequency than the gaming object transceiver for RF communications or it may use the same, or nearly the same, frequency. In the latter case, the frequency spectrum may be shared using a TDMA, FDMA, or some other sharing protocol. If the RFID tag and the gaming object transceiver share the frequency spectrum, they may share the antenna structures. Note that the antenna structures may be configurable as discussed in patent application entitled, “INTEGRATED CIRCUIT ANTENNA STRUCTURE”, having a Ser. No. of 11/648,826, and a filing date of Dec. 29, 2006, patent application entitled, “MULTIPLE BAND ANTENNA STRUCTURE, having a Ser. No. of 11/527,959, and a filing date of Sep.
  • FIG. 56 is a schematic block diagram of an embodiment of a three-dimensional antenna structure 350 that includes at least one antenna having a radiation pattern along each of the three axes (x, y, z). Note that the 3D antenna structure 350 may include more than three antennas having radiation patterns at any angle within the three-dimensional space. Note that the antennas may be configurable antennas as previously discussed to accommodate different frequency bands.
  • FIG. 57 is a diagram of an example of an antenna radiation pattern 352 for one of the antennas of the antenna structure 350 of FIG. 56 .
  • FIGS. 58 and 59 are diagrams of an example of frequency dependent motion calculation where a signal (TX) is received at time tn and another signal (TX) is received at time tn+1, where n is any number.
  • the signal is received with respect to the xy plane and with respect to the xz plane by the three antennas of FIG. 56 .
  • each antenna will receive the signal with a different amplitude (and possibly a different phase as well) due to its angle with respect to the source of the signal. From these differing received signals, the angular direction of the source with respect to the 3D antenna structure can be determined.
  • one or more of the distance determination techniques discussed herein may be used (e.g., attenuation of the magnitude of the transmitted signal).
  • the position of the 3D antenna structure, which may be affiliated with a player and/or gaming object, can be determined for time tn.
  • FIG. 59 shows the signal being received at time tn+1, which is at a different angle than the signal transmitted at time tn.
  • the differing received signals by the antennas are used to determine the angular position of the source and one or more of the distance determination techniques discussed herein may be used to determine the distance to the source. From the known angular position and the known distances, the position of the 3D antenna structure may be determined for time tn+1. Comparing the position of the 3D antenna structure at time tn with its position at time tn+1 yields its motion.
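One of the distance determination techniques referenced above, attenuation of the transmitted signal's magnitude, is commonly modeled with a log-distance path-loss relation. The sketch below inverts such a model to obtain a range from a received signal strength reading; the reference power, path-loss exponent, and RSSI value are assumptions for illustration.

```python
def distance_from_rssi(rssi_dbm, ref_power_dbm=-40.0, path_loss_exp=2.0):
    """Invert a log-distance path-loss model: rssi = ref_power - 10*n*log10(d),
    where ref_power is the level measured at 1 meter and n is the path-loss
    exponent (about 2 in free space, typically higher indoors)."""
    return 10.0 ** ((ref_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

print(distance_from_rssi(-58.0))  # about 7.9 meters under the assumed model
```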
  • FIG. 60 is a diagram of a method for determining motion that begins at step 360 by transmitting an RF signal at time tn.
  • the RF signal may be a narrow pulse, may be a sinusoidal signal, and/or may be an RF transmission in accordance with a wireless communication protocol.
  • the method continues at step 362 with the 3D antenna structure receiving the RF signal.
  • the method continues at step 364 by determining a 3D vector of the received RF signal. An example of this is shown in FIG. 61 .
  • the method continues at step 366 by transmitting another RF signal at time tn+1.
  • the method continues at step 368 with the 3D antenna structure receiving the RF signal.
  • the method continues at step 370 by determining a 3D vector of the received RF signal. An example of this is shown in FIG. 61 .
  • the method continues at step 372 by determining the motion of the player and/or gaming object by comparing the two 3D vectors. This process continues for each successive tn and tn+1 combination. Note that the duration between tn and tn+1 may vary depending on one or more of the video game being played, the speed of motion, the anticipated speed of motion, the quality of the motion estimation, and/or motion prediction algorithms, etc.
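Putting FIGS. 58 through 61 together: each capture yields an angular direction plus a distance estimate, which can be folded into a 3D position vector, and differencing the vectors from times tn and tn+1 yields the motion. The sketch below uses a simple spherical-to-Cartesian conversion with made-up angles and distances; the function names are illustrative only.

```python
import math

def position_from_bearing(azimuth_rad, elevation_rad, distance_m):
    """3D position of the source relative to the 3D antenna structure, built from
    the angular direction (derived from per-antenna amplitude/phase differences)
    and a distance estimate (e.g., from signal attenuation)."""
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

def motion_between(p_n, p_n1, dt_s):
    """Velocity vector between the positions at times tn and tn+1."""
    return tuple((b - a) / dt_s for a, b in zip(p_n, p_n1))

p_n = position_from_bearing(math.radians(30), math.radians(10), 2.0)
p_n1 = position_from_bearing(math.radians(35), math.radians(12), 2.1)
print(motion_between(p_n, p_n1, 0.05))  # m/s over a 50 ms interval
```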
  • FIG. 62 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12 , a player 16 , a gaming object 14 , and a plurality of directional microphones 280 .
  • the gaming system has an associated physical area in which the gaming object and player are located.
  • the physical area may be a room, portion of a room, and/or any other space where the gaming object 14 and game console device 12 are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • the plurality of directional microphones 380 - 382 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures audible, near audible, and/or ultrasound signals (together, acoustic waves) of the player 16 and/or gaming object 14 within the physical area.
  • the directional microphones 380 - 382 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • the captured audible, near audible, and/or ultrasound signals are used to determine the initial position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the directional microphones 380 - 382 may be positioned and/or adjusted to focus on the player and/or gaming object movement. The captured signals are then processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include near audible and/or ultrasound signal generators thereon facilitating the position and/or motion tracking processing.
  • FIG. 63 is a diagram of an example of audio, near audio, and ultrasound frequency bands that may be used by the system of FIG. 62 .
  • the gaming object and/or the player may emit a positioning tone (e.g., a sinusoidal signal) that the directional microphones can use for position determination.
  • the microphones may serve a dual purpose: capturing audio for normal game play, game set up, game authentication, player authentication, gaming object authentication, and for position determination and motion tracking.
  • the gaming object and/or the player may transmit a near audible signal (e.g., a tone at 25 KHz), which is above the audible frequency range, but within the bandwidth of the directional microphones 380 - 382 .
  • the directional microphones may adjust their position to focus in on the source of the tone. The angular positioning and the intersection thereof may be used to determine the location of the gaming object and/or the player.
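Where two or more directional microphones each report the azimuth at which the tone is heard most strongly, the intersection of those bearings gives the 2D location. The following sketch solves that intersection for two microphones; the positions and angles are hypothetical and the function name is invented for the example.

```python
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """2D source location from two microphone positions and the azimuths at which
    each directional microphone hears the tone most strongly.

    Solves p1 + t1*u1 = p2 + t2*u2 for the ray parameters t1 and t2."""
    u1 = (math.cos(theta1), math.sin(theta1))
    u2 = (math.cos(theta2), math.sin(theta2))
    det = u1[0] * (-u2[1]) - u1[1] * (-u2[0])
    if abs(det) < 1e-9:
        raise ValueError("bearings are parallel; a third microphone is needed")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * (-u2[1]) - dy * (-u2[0])) / det
    return (p1[0] + t1 * u1[0], p1[1] + t1 * u1[1])

# Hypothetical microphones at two room corners hearing the tone at 45 and 135 degrees.
print(intersect_bearings((0.0, 0.0), math.radians(45), (4.0, 0.0), math.radians(135)))
# -> approximately (2.0, 2.0)
```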
  • FIG. 64 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a gaming object 14 , a player 16 , a directional microphone 390 , and a game console device 12 .
  • the game console device 12 and/or the gaming object 14 include one or more directional microphones (and may include transmitters) 390 that have their orientation adjusted based on the position and/or motion of the gaming object to better receive an audible signal from the gaming object, player, and/or the game console device.
  • the gaming object 14 may also include a sound energy transmitter.
  • a position of the gaming object may be determined.
  • sound energy from the transmitters 390 is reflected by the user and/or the gaming object. Based upon the receipt of such sound energy, a position of the gaming object and/or the user may be determined.
  • FIG. 65 is a schematic block diagram of an overhead view of still another embodiment of a position location system in accordance with the present invention.
  • the position location system of FIG. 65 includes first position location sub-system, second position location sub-system, and processing circuitry that is coupled to the first position location sub-system and to the second position location sub-system.
  • the position location system illustrated in FIG. 65 is deployed within a gaming environment 6502 .
  • the position location system of FIG. 65 may be deployed in a physical area that does not support gaming. In such case, the position location system of FIG. 65 would simply be used to locate objects other than gaming objects within the physical area.
  • the video gaming system 6500 supports video gaming within the video gaming environment 6502 of the physical area. Consistently with the previous description made herein, the video gaming system 6500 includes a game console 12 , a gaming object 14 , and a gaming object 15 . During play, a player 16 holds one or both the gaming objects 14 and 15 and the position location system performs position 18 and motion tracking 20 of the objects 14 and/or 15 and/or the player 16 . By performing such position 18 and motion tracking 20 , the position location system supports the player 16 interacting with a gaming function supported by game console 12 .
  • the position location system of FIG. 65 includes at least two position location sub-systems, at least two of which use differing position location techniques.
  • the position location system includes a first position location sub-system having position location sub-system components 6504 A, 6504 B, and 6504 C.
  • the position location system includes a second position location sub-system having position location sub-system components 6506 A, 6506 B, and 6506 C.
  • each of these position location sub-system components couples to the game console 12 via wired and/or wireless communication links.
  • the game console 12 includes processing circuitry that receives position location information from the at least two position location sub-systems and processes such information to locate one or more of gaming object 14 , gaming object 15 , and player 16 within the gaming environment 6502 .
  • processing circuitry locates one or more objects within the physical area without supporting gaming.
  • each of the first position location sub-system and the second position location sub-system may include a plurality of receivers oriented about the gaming environment/physical area 6502 .
  • the first position location sub-system may include at least one transmitter.
  • the at least one transmitter may be located in conjunction with the gaming object 14 or 15 and/or may be co-located with the various receivers of the first and second position location sub-systems that are located about the gaming area 6502 .
  • position location sub-system components are distributed about the gaming area, e.g., 6504 A- 6506 C and may have substantially co-located receivers and transmitters.
  • the first position location sub-system uses a first position location technique while the second position location sub-system uses a second position location technique that differs from the first position location technique.
  • the various position location techniques that may be employed by the first and second position location sub-systems of FIG. 65 have been described previously herein with reference to FIGS. 1-64 .
  • these techniques include one or more of acoustic wave detection, RF signal detection, digital imaging, IR detection, laser distance measurement, thermal imaging, and/or multiple access accelerometer sensing.
  • the object 14 and/or 15 may include at least one acoustic energy source, e.g., ultrasonic transmitter, and the first position location sub-system may include a plurality of sound energy receivers that are located about the gaming environment 6502 .
  • the object 14 or 15 includes at least one RF transmitter and one or more of the first and second position location sub-systems include a plurality of receivers.
  • the object 14 or 15 includes at least one RF receiver and the first and/or the second position location sub-systems include a plurality of RF transmitters.
  • the first and/or second position location sub-systems include at least one RF transmitter and a plurality of RF receivers.
  • one or more gaming objects 14 and/or 15 include(s) a plurality of digital cameras. This technique, as was previously described herein, uses the digital cameras of the gaming objects 14 and/or 15 to recognize reference points in the gaming environment 6502 and to determine the position(s) of the object(s) 14 and/or 15 based upon these reference points.
  • the first and/or second position location sub-systems include a plurality of digital cameras. The first and/or second position location sub-systems identify reference points, including object reference points, captured in the digital images to locate gaming object 14 and/or 15 .
  • the object may include an IR source and the first and/or second position location sub-systems include a plurality of IR receivers.
  • the first and/or second position location sub-systems include at least one IR source and a plurality of IR receivers. Any of these various techniques may be employed with the position location sub-systems and are illustrated further herein with reference to FIGS. 66-73 .
  • FIG. 66 is a schematic block diagram of an overhead view of yet another embodiment of a position location system in accordance with the present invention.
  • a position location system includes a first position location sub-system, a second position location sub-system, and processing circuitry that couples to the first position location sub-system and to the second position location sub-system.
  • the first position location sub-system includes a plurality of position location sub-system components 6604 A, 6604 B, and 6604 C.
  • the second position location sub-system includes a plurality of position location sub-system components 6606 A, 6606 B, and 6606 C.
  • the position location sub-system(s) components couple to the processing circuitry of game console 12 via a wired and/or wireless communication link.
  • the first position location sub-system is operable to determine first position location information regarding a first gaming object 14 using a first position location technique.
  • the second position location sub-system is operable to determine second position location information regarding a second object 52 .
  • the second position location sub-system uses a second position location technique that differs from the first position location technique.
  • various position location techniques may be employed in accordance with the present invention.
  • the first position location sub-system includes components 6604 A- 6604 C that use a position location technique that differs from a second position location technique used by the second position location sub-system that includes components 6606 A- 6606 C.
  • the game console 12 includes processing circuitry coupled to both the first position location sub-system and to the second position location sub-system via wired and/or wireless couplings.
  • the processing circuitry of gaming console 12 processes the first position location information to determine a position of the first object 14 within a coordinate system. Further, the processing circuitry processes the second position location information to determine a position of the second object 52 within the coordinate system.
  • the coordinate system is associated with the physical environment within which the position location system is deployed. When the position location system operates in conjunction with a video gaming system, the coordinate system in which objects 14 and 52 are located is related to a video gaming function.
  • the locations of gaming objects 14 and 52 are related to the gaming environment to incorporate the gaming functions and operations as well as positions of players 16 and 50 into the video game function.
  • the first position location sub-system is used to determine position 18 and perform motion tracking 20 of player 16 and/or gaming object 14 .
  • the second position location sub-system is employed to determine position 54 and/or motion tracking 56 of player 50 and/or gaming object 52 .
  • the coordinate system used with the system of FIG. 66 may include a three-dimensional Cartesian coordinate system or a spherical coordinate system.
  • the position location sub-systems of FIG. 66 include receivers and/or transmitters deployed about a physical area that are operable to locate players 16 and 50 and/or gaming objects 14 and/or 52 within the physical area. Further operations of the position location system of FIG. 66 will be described further herein with reference to FIGS. 71-73 . Generally, the operations described herein with reference to FIGS. 67-73 may be employed with one or both of the systems of FIGS. 65 and 66 .
  • FIG. 67 is a flow chart illustrating operations of a position location system employing multiple position location techniques.
  • the position location system first evaluates its physical environment (Step 670 ).
  • the position location sub-system may perform calibration operations. Such calibration operations may be performed according to techniques previously described herein and that will be described herein with reference to FIG. 68 .
  • Operation proceeds with capturing first position location information regarding the object using a first position location sub-system that uses a first position location technique (Step 672 ).
  • the system of FIGS. 65 and/or 66 may be employed with the operations of FIG. 67 to locate the object using first position location sub-system.
  • Operation proceeds to capturing second position location information regarding the object by a second position location sub-system using the second position location technique (Step 674 ).
  • processing circuitry or another processing device processes the first position location information and the second position location information to determine a position of the object within a coordinate system (Step 676 ).
  • the coordinate system may correspond to a gaming environment, a factory, an office, a shopping mall, or any other physical area within which objects may be located.
  • the position of the object within the coordinate system is integrated into a video game function (Step 678 ).
  • the first position location information and the second position location information are used in differing manners.
  • the first position location information may be used as primary information to locate the object while the second position location information may be used as secondary information to locate the object.
  • the second position location information is used as a safeguard or resolution enhancement to error check or increase the resolution of the first position location information.
  • the second position location information may be used to calibrate the first position location information. Such calibration may occur at startup and/or during standard intervals of operation of the position location system.
  • the second position location information is simply used to augment the first position location information.
  • An example of augmentation use of the second position location information occurs when the first position location information is interrupted intermittently or infrequently. In such case, the second position location information would fill in the missing first position location information.
  • augmentation of the first position location information with the second position location information occurs at different points in the gaming operation when additional resolution, enhanced motion detection, greater positional accuracy, or another operation occurs for which a single position location technique is insufficient for the current demands.
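One way to read the primary/secondary and fill-in roles described above is as a simple freshness-based selection: use the first sub-system's fix when it is present and recent, and otherwise let the second sub-system's fix stand in. The function below is a minimal sketch of that policy; the 50 ms freshness window and the data layout are assumptions, not requirements of the system.

```python
def fuse_position(primary, secondary, max_age_s=0.05, now_s=0.0):
    """Pick the primary fix when it is available and fresh; otherwise fall back
    to (or fill in with) the secondary fix.

    Each fix is either None or a (timestamp_s, (x, y, z)) tuple."""
    if primary is not None and (now_s - primary[0]) <= max_age_s:
        return primary[1]
    if secondary is not None:
        return secondary[1]
    return None  # no usable fix this cycle

# The primary RF fix dropped out, so the acoustic fix fills in.
print(fuse_position(None, (0.98, (1.1, 2.0, 0.4)), now_s=1.0))
```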
  • FIG. 68 is a flow chart illustrating usage of multiple position location techniques for locating an object.
  • operation begins with evaluating the physical environment (Step 680 ).
  • the first position location sub-system captures the first position location information regarding the object using a first position location technique (Step 682 ).
  • the second position location sub-system captures the second position location information regarding the object using the second position location technique (Step 684 ).
  • the processing circuitry then calibrates the first position location sub-system using the second position location information to produce calibration settings (Step 686 ). From Step 686 , operation ends.
  • the operations of FIG. 68 may be employed at startup, periodically, or when a lack of acceptable calibration is detected by the position location system.
  • FIG. 69 is a flow chart illustrating usage of multiple position location techniques for determining position and motion of an object.
  • the operations of FIG. 69 commence with position location systems evaluating the physical environment within which the position location system operates (Step 690 ).
  • the first position location sub-system then captures the first position location information regarding the object using the first position location technique (Step 692 ).
  • Operation proceeds to capturing second position location information regarding the object by the second position location sub-system using a second position location technique (Step 694 ).
  • the processing circuitry determines the position of the object using the first position location information (Step 696 ).
  • the processing circuitry determines a motion of the object using the second position location information (Step 698 ).
  • One particular alternate embodiment of the operations of FIG. 69 includes using one position location technique that is very good at determining the position of the object but not as good at determining motion.
  • One example of such operation is using an acoustic wave detection technique to locate an object within a gaming environment while using multiple access accelerometer sensing to determine motion of the object.
  • an RF signal detection technique could be used to locate the object while using the multiple access accelerometer to detect motion of the object. In such case, a very high quality capture of both position and motion would result.
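A sketch of that division of labor appears below: an RF (or acoustic) fix supplies the absolute position, and accelerometer samples are integrated for fine-grained motion between fixes. The sampling interval, data layout, and simple double integration are assumptions for illustration; a practical tracker would also correct the accelerometer drift at each new fix.

```python
def track(rf_position, accel_samples, dt_s):
    """Anchor absolute position with an RF (or acoustic) fix, then integrate
    accelerometer samples for fine-grained motion between fixes.

    accel_samples: list of (ax, ay, az) in m/s^2, one sample every dt_s seconds."""
    pos = list(rf_position)
    vel = [0.0, 0.0, 0.0]
    trajectory = [tuple(pos)]
    for ax, ay, az in accel_samples:
        for i, a in enumerate((ax, ay, az)):
            vel[i] += a * dt_s       # velocity from acceleration
            pos[i] += vel[i] * dt_s  # position from velocity
        trajectory.append(tuple(pos))
    return trajectory

# 1 m/s^2 of acceleration along x for 100 ms, starting from the RF-determined fix.
print(track((1.0, 2.0, 0.5), [(1.0, 0.0, 0.0)] * 10, 0.01))
```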
  • FIG. 70 is a flow chart illustrating operation for using multiple position location techniques to determine position and orientation of an object.
  • the operations of FIG. 70 commence with the position location system evaluating the physical environment (Step 700 ). Then, the first position location sub-system captures first position location information regarding the object using the first position location technique (Step 702 ). The second position location sub-system then captures second position location information regarding the object using a second position location technique (Step 704 ). The processing circuitry then determines the position of a first reference point of the object using the first position location information (Step 706 ).
  • the processing circuitry next determines the position of a second reference point of the object using the second position location information (Step 708 ). Then, the processing circuitry determines a position of the object using the first and/or second position location information (Step 706 ). Finally, the position location system determines an orientation of the object using the first and/or second position location information (Step 708 ).
  • the gaming object 14 may include multiple reference points and the player 16 may wear a plurality of sensing tags 44 .
  • the various reference points, e.g., sensing tags 44 worn by player 16 and/or multiple reference points of gaming object 14 may be separately tracked using two different position location techniques.
  • one reference point, e.g., a sensing tag 44 located on an arm or head of the player 16 may be used to locate the player while information captured regarding differing sensing tags 44 of the player 16 may be used to determine an orientation of the player within the gaming environment.
  • the first position location technique may be used to determine a position of one of the sensing tags 44 and the second position location technique may be used to determine location of a second sensing tag on the gaming object 14 .
  • both the position and orientation of gaming object 14 may be determined.
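As a concrete sketch of deriving orientation from two tracked reference points (for example, one tag near the tip of the gaming object and one near its base, each located by a different sub-system), the yaw and pitch of the object follow from the vector between the two points. The coordinates and function name below are illustrative assumptions.

```python
import math

def orientation_from_points(tip, tail):
    """Yaw and pitch (degrees) of a gaming object from the positions of two
    reference points on it, e.g., one tag located by each sub-system."""
    dx, dy, dz = (tip[0] - tail[0], tip[1] - tail[1], tip[2] - tail[2])
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch

# Tip tag located by the first sub-system, tail tag by the second.
print(orientation_from_points((1.5, 2.0, 1.2), (1.0, 2.0, 1.0)))
# -> roughly (0.0, 21.8): pointing along +x and tilted upward
```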
  • FIG. 71 is a flow chart illustrating operation for using multiple position location techniques to determine positions of multiple objects.
  • the operation of FIG. 71 commences with the position location system evaluating a physical environment in which the position location system is deployed (Step 710 ). Operation continues with the first position location sub-system capturing first position location information regarding a first object using a first position location technique (Step 712 ). Operation continues with the second position location sub-system capturing second position location information regarding a second object using a second position location technique (Step 714 ). The processing circuitry of the position location system then processes the first position location information to determine a position of the first object within a coordinate system (Step 716 ). The coordinate system would have been established at Step 710 and may be included with a video game function as has been previously described in great detail with reference to the present invention.
  • Operation continues with the processing circuitry processing the second position location information to determine a position of a second object within the coordinate system (Step 718 ).
  • a system in which multiple gaming object positions are tracked was previously described herein with reference to FIG. 66 .
  • Operation continues in FIG. 71 with the processing circuitry integrating the positions of the first and second objects within the coordinate system into a video game function (Step 719 ).
  • the video game function operations will be employed when the position location sub-system operates in conjunction with the video game function. When the position location system is not used in conjunction with the video game function, Step 719 would not occur.
  • FIG. 72 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects.
  • the operation of FIG. 72 commences with the position location system evaluating the physical environment (Step 720 ).
  • the position location system may establish a coordinate system within the physical environment.
  • operation continues with the first position location sub-system capturing first position location information regarding a first object using a first position location technique (Step 722 ).
  • Operation continues with the second position location sub-system capturing second position location information regarding a second object using a second position location technique (Step 723 ).
  • the processing circuitry or gaming console processes the first position location information to determine a position of the first object within a coordinate system (Step 724 ).
  • Operation continues with the position location system processing second position location information to determine a position of the second object within the coordinate system (Step 725 ).
  • the processing circuitry determines motion of the first object using the first position location information (Step 726 ).
  • the processing circuitry determines a motion of the second object using the second position location information (Step 727 ).
  • the first position location sub-system operates solely upon the first object while the second position location sub-system operates solely upon the second object.
  • the first position location sub-system may locate multiple reference points on the object (or the player) for subsequent processing.
  • the second position location sub-system may locate multiple reference points on the second object (or player) for subsequent processing.
  • FIG. 73 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects.
  • the position location sub-system evaluates the physical environment within which the position location sub-system is deployed (Step 730 ).
  • the first position location sub-system then captures first position location information regarding the first object using a first position location technique (Step 732 ).
  • the second position location sub-system then captures the second position location information regarding a second object using second position location technique (Step 733 ).
  • the processing circuitry of the position location sub-system then processes the first position location information to determine a position of the first object within the coordinate system (Step 734 ).
  • the processing circuitry next processes the second position location information to determine a position of the second object within the coordinate system (Step 735 ).
  • Operation continues with the processing circuitry determining a motion of the second object using the first position location information (Step 736 ). Finally, operation concludes with the processing circuitry determining a motion of the first object using the second position location information (Step 737 ).
  • the operations of FIG. 73 use a cross position location technique on common objects. In such case, a first position location technique is used to locate an object while a second position location technique is used to detect motion of the object. In such case, even though only two position location sub-systems are included with the position location system, cross technique benefits are provided for multiple gaming objects tracking purposes.
  • the terms “substantially” and “approximately” provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences.
  • the term(s) “coupled to” and/or “coupling” includes direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level.
  • inferred coupling includes direct and indirect coupling between two items in the same manner as “coupled to.”
  • “operable to” indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform one or more of its corresponding functions and may further include inferred coupling to one or more other items.
  • associated with includes direct and/or indirect coupling of separate items and/or one item being embedded within another item.
  • compares favorably indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2 , a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1 .

Abstract

A position location system includes a first position location sub-system, a second position location sub-system, and processing circuitry. The first position location sub-system determines first position location information regarding an object using a first position location technique. The second position location sub-system determines second position location information regarding the object, the second position location sub-system using a second position location technique that differs from the first position location technique. The processing circuitry processes the first position location information and the second position location information to determine position of the object within a coordinate system.

Description

  • This patent application claims priority under 35 USC § 119 to a provisionally filed patent application entitled POSITION AND MOTION TRACKING OF AN OBJECT, having a provisional filing date of Jun. 2, 2007, and a provisional Ser. No. of 60/936,724 (BP6471).
  • CROSS REFERENCE TO RELATED PATENTS
  • NOT APPLICABLE
  • The following U.S. Utility Applications are related to the present application and are incorporated herein by reference in their entirety:
  • 1. The U.S. Utility application Ser. No. 12/128,797, filed May 29, 2008, entitled LOCAL POSITIONING SYSTEM AND VIDEO GAME APPLICATIONS THEREOF, (BP7144);
  • 2. The U.S. Utility application Ser. No. 12/128,810, filed May 29, 2008, entitled APPARATUS FOR POSITION DETECTION USING MULTIPLE ANTENNAS, (BP7147);
  • 3. The U.S. Utility application Ser. No. 12/128,785, filed May 29, 2008, entitled APPARATUS FOR POSITION DETECTION USING MULTIPLE HCF TRANSMISSIONS, (BP7143);
  • 4. The U.S. Utility application Ser. No. 12/135,332, filed Jun. 9, 2008, entitled POSITION DETECTION AND/OR MOVEMENT TRACKING VIA IMAGE CAPTURE AND PROCESSING, (BP7149);
  • 5. The U.S. Utility application Ser. No. 12/135,341, filed Jun. 9, 2008, entitled DIRECTIONAL MICROPHONES FOR POSITION DETERMINATION, (BP7151);
  • 6. The U.S. Utility application Ser. No. 12/142,032, filed Jun. 19, 2008, entitled POSITIONING WITHIN A VIDEO GAMING ENVIRONMENT USING RF SIGNALS, (BP7145); and
  • 7. The U.S. Utility application Ser. No. 12/142,064, filed Jun. 19, 2008, entitled RFID BASED POSITIONING SYSTEM, (BP7148).
  • STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • NOT APPLICABLE
  • INCORPORATION-BY-REFERENCE OF MATERIAL SUBMITTED ON A COMPACT DISC
  • NOT APPLICABLE
  • BACKGROUND OF THE INVENTION
  • 1. Technical Field of the Invention
  • This invention relates generally to position location systems and more particularly to determining position of one or more objects within a position location system.
  • 2. Description of Related Art
  • Communication systems are known to support wireless and wire lined communications between wireless and/or wire lined communication devices. Such communication systems range from national and/or international cellular telephone systems to the Internet to point-to-point in-home wireless networks to radio frequency identification (RFID) systems. Each type of communication system is constructed, and hence operates, in accordance with one or more communication standards. For instance, radio frequency (RF) wireless communication systems may operate in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof. As another example, infrared (IR) communication systems may operate in accordance with one or more standards including, but not limited to, IrDA (Infrared Data Association).
  • Depending on the type of RF wireless communication system, a wireless communication device, such as a cellular telephone, two-way radio, personal digital assistant (PDA), personal computer (PC), laptop computer, home entertainment equipment, RFID reader, RFID tag, et cetera communicates directly or indirectly with other wireless communication devices. For direct communications (also known as point-to-point communications), the participating wireless communication devices tune their receivers and transmitters to the same channel or channels (e.g., one of the plurality of radio frequency (RF) carriers of the wireless communication system) and communicate over that channel(s). For indirect wireless communications, each wireless communication device communicates directly with an associated base station (e.g., for cellular services) and/or an associated access point (e.g., for an in-home or in-building wireless network) via an assigned channel. To complete a communication connection between the wireless communication devices, the associated base stations and/or associated access points communicate with each other directly, via a system controller, via the public switch telephone network, via the Internet, and/or via some other wide area network.
  • For each RF wireless communication device to participate in wireless communications, it includes a built-in radio transceiver (i.e., receiver and transmitter) or is coupled to an associated radio transceiver (e.g., a station for in-home and/or in-building wireless communication networks, RF modem, etc.). As is known, the receiver is coupled to the antenna and includes a low noise amplifier, one or more intermediate frequency stages, a filtering stage, and a data recovery stage. The low noise amplifier receives inbound RF signals via the antenna and amplifies them. The one or more intermediate frequency stages mix the amplified RF signals with one or more local oscillations to convert the amplified RF signal into baseband signals or intermediate frequency (IF) signals. The filtering stage filters the baseband signals or the IF signals to attenuate unwanted out of band signals to produce filtered signals. The data recovery stage recovers raw data from the filtered signals in accordance with the particular wireless communication standard.
  • As is also known, the transmitter includes a data modulation stage, one or more intermediate frequency stages, and a power amplifier. The data modulation stage converts raw data into baseband signals in accordance with a particular wireless communication standard. The one or more intermediate frequency stages mix the baseband signals with one or more local oscillations to produce RF signals. The power amplifier amplifies the RF signals prior to transmission via an antenna.
  • In most applications, radio transceivers are implemented in one or more integrated circuits (ICs), which are inter-coupled via traces on a printed circuit board (PCB). The radio transceivers operate within licensed or unlicensed frequency spectrums. For example, wireless local area network (WLAN) transceivers communicate data within the unlicensed Industrial, Scientific, and Medical (ISM) frequency spectrum of 900 MHz, 2.4 GHz, and 5 GHz. While the ISM frequency spectrum is unlicensed, there are restrictions on power, modulation techniques, and antenna gain.
  • In IR communication systems, an IR device includes a transmitter, a light emitting diode, a receiver, and a silicon photo diode. In operation, the transmitter modulates a signal, which drives the LED to emit infrared radiation which is focused by a lens into a narrow beam. The receiver, via the silicon photo diode, receives the narrow beam infrared radiation and converts it into an electric signal.
  • IR communications are used in video games to detect the direction in which a game controller is pointed. As an example, an IR sensor is placed near the game display, where the IR sensor detects the IR signal transmitted by the game controller. If the game controller is too far away, too close, or angled away from the IR sensor, the IR communication will fail.
  • Further advances in video gaming include three accelerometers in the game controller to detect motion by way of acceleration. The motion data is transmitted to the game console via a Bluetooth wireless link. The Bluetooth wireless link may also transmit the IR direction data to the game console and/or convey other data between the game controller and the game console.
  • While the above technologies allow video gaming to include motion sensing, they do so with limitations. As mentioned, the IR communication has a limited area in which a player can be for the IR communication to work properly. Further, the accelerometer only measures acceleration such that true one-to-one detection of motion is not achieved. Thus, the gaming motion is limited to a handful of directions (e.g., horizontal, vertical, and a few diagonal directions).
  • Therefore, a need exists for improved motion tracking and positioning determination for video gaming and other applications.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Drawings, the Detailed Description of the Invention, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a gaming system in accordance with the present invention;
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system in accordance with the present invention;
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 4 is a schematic block diagram of a side view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 5 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 6 is a schematic block diagram of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 7 is a schematic block diagram of another embodiment of a gaming system in accordance with the present invention;
  • FIGS. 8-10 are diagrams of an embodiment of a coordinate system of a gaming system in accordance with the present invention;
  • FIGS. 11-13 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with the present invention;
  • FIG. 14 is a diagram of a method for determining position and/or motion tracking in accordance with the present invention;
  • FIGS. 15A and 15B are diagrams of other methods for determining position and/or motion tracking in accordance with the present invention;
  • FIGS. 16-18 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with the present invention;
  • FIGS. 19-21 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with the present invention;
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 23 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 24 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 25 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 26 is a diagram of another embodiment of a coordinate system of a gaming system in accordance with the present invention;
  • FIG. 27 is a schematic block diagram of an embodiment of a wireless communication system in accordance with the present invention;
  • FIG. 28 is a schematic block diagram of another embodiment of a wireless communication system in accordance with the present invention;
  • FIG. 29 is a schematic block diagram of another embodiment of a wireless communication system in accordance with the present invention;
  • FIG. 30 is a schematic block diagram of an overhead view of an embodiment of determining position and/or motion tracking in accordance with the present invention;
  • FIG. 31 is a schematic block diagram of a side view of an embodiment of determining position and/or motion tracking in accordance with the present invention;
• FIG. 32 is a schematic block diagram of an embodiment of a transceiver in accordance with the present invention;
  • FIG. 33 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 34 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 35 is a schematic block diagram of an embodiment of a wireless communication in accordance with the present invention;
  • FIG. 36 is a diagram of an embodiment of an antenna pattern in accordance with the present invention;
  • FIG. 37 is a diagram of another embodiment of an antenna pattern in accordance with the present invention;
  • FIG. 38 is a diagram of an example of receiving an RF signal in accordance with the present invention;
  • FIG. 39 is a diagram of an example of frequency dependent in-air attenuation in accordance with the present invention;
  • FIGS. 40 and 41 are diagrams of an example of frequency dependent distance calculation in accordance with the present invention;
  • FIG. 42 is a diagram of an example of constructive and destructive signaling in accordance with the present invention;
  • FIG. 43 is a diagram of another example of constructive and destructive signaling in accordance with the present invention;
  • FIG. 44 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 45 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 46 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 47 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 48 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 49 is a diagram of another method for determining position and/or motion tracking in accordance with the present invention;
  • FIG. 50 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 51 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 52 is a schematic block diagram of a side view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 53 is a schematic block diagram of an embodiment of an RFID reader and an RFID tag in accordance with the present invention;
  • FIG. 54 is a diagram of a method for determining position in accordance with the present invention;
  • FIG. 55 is a schematic block diagram of an embodiment of a gaming object in accordance with the present invention;
• FIG. 56 is a schematic block diagram of an embodiment of a three-dimensional antenna structure in accordance with the present invention;
  • FIG. 57 is a diagram of an example of an antenna radiation pattern in accordance with the present invention;
  • FIGS. 58 and 59 are diagrams of an example of frequency dependent motion calculation in accordance with the present invention;
  • FIG. 60 is a diagram of a method for determining motion in accordance with the present invention;
  • FIG. 61 is a diagram of an example of determining a motion vector in accordance with the present invention;
  • FIG. 62 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
  • FIG. 63 is a diagram of an example of audio and near audio frequency bands in accordance with the present invention;
• FIG. 64 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
• FIG. 65 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with the present invention;
• FIG. 66 is a schematic block diagram of an overhead view of yet another embodiment of a position location system in accordance with the present invention;
• FIG. 67 is a flow chart illustrating operations of a position location system employing multiple position location techniques;
• FIG. 68 is a flow chart illustrating usage of multiple position location techniques for locating an object;
• FIG. 69 is a flow chart illustrating usage of multiple position location techniques for determining position and motion of an object;
• FIG. 70 is a flow chart illustrating operation for using multiple position location techniques to determine position and orientation of an object;
• FIG. 71 is a flow chart illustrating operation for using multiple position location techniques to determine positions of multiple objects;
• FIG. 72 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects;
  • FIG. 73 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a video gaming system 10 that includes a game console device 12 and a gaming object 14 associated with a player 16. The video gaming system 10 is within a gaming environment 22, which may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 can be proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
• In operation, the game console device 12 (embodiments of which will be described in greater detail with reference to FIGS. 2-7, 14-25, and 27-XX) determines the gaming environment 22. This may be done by sweeping the area with one or more signals within one or more frequency bands. For example, the one or more signals may be in the ultrasound frequency band of 20 KHz to 200 MHz, the radio frequency band of 30 Hz to 3 GHz, the microwave frequency band of 3 GHz to 300 GHz, the infrared (IR) frequency band of 300 GHz to 428 THz, the visible light frequency band of 428 THz to 750 THz (n×10¹²), the ultraviolet radiation frequency band of 750 THz to 30 PHz (n×10¹⁵), and/or the X-Ray frequency band of 30 PHz to 30 EHz (n×10¹⁸).
  • The determination of the gaming environment 22 continues with the gaming console device 12 measuring at least one of: reflection of the one or more signals, absorption of the one or more signals, refraction of the one or more signals, pass through of the one or more signals, angle of incident of the one or more signals, backscattering of the one or more signals, and magnetization induced by the one or more signals to produce measured signal effects. The game console device 12 then identifies different objects based on the measured signal effects (e.g., inanimate objects have different reflective, absorption, pass through, and/or refractive properties of the one or more signals than animate beings).
  • The game console device 12 then determines distance of the different objects with respect to itself. From this data, the game console device 12 generates a three-dimensional topographic map of the area in which the video gaming system 10 resides to produce the gaming environment 22. In this example, the gaming environment 22 includes the player 16, the gaming object 14, a couch, a chair, a desk, the four encircling walls, the floor, and the ceiling.
• Having determined the gaming environment, the game console device 12 maps the gaming environment 22 to a coordinate system (e.g., a three-dimensional Cartesian coordinate system [x, y, z], a spherical coordinate system [ρ, φ, θ], etc.). The game console device 12 then determines the position 18 of the player 16 and/or the gaming object 14 within the gaming environment in accordance with the coordinate system.
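• The mapping just described can be illustrated with a short sketch. The following Python fragment is purely illustrative; the function names, the (range, azimuth, elevation) return format, and the reflectivity threshold are assumptions rather than anything specified by this disclosure. It shows one simple way swept-beam returns could be converted into points of a three-dimensional Cartesian map of the gaming environment and coarsely labeled as animate or inanimate:

```python
import math

def sweep_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one swept-beam return (range plus beam angles) into
    Cartesian coordinates relative to the console at the origin."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

def build_topographic_map(sweep_returns):
    """Map a list of (range, azimuth, elevation, reflectivity) returns to
    points in the gaming-environment coordinate system."""
    environment = []
    for rng, az, el, reflectivity in sweep_returns:
        point = sweep_return_to_xyz(rng, az, el)
        # Illustrative rule of thumb only: strongly reflective returns are
        # treated as inanimate objects (walls, furniture); weaker, diffuse
        # returns as animate beings. Real measured signal effects are richer.
        label = "inanimate" if reflectivity > 0.5 else "animate"
        environment.append({"xyz": point, "label": label})
    return environment

# Example: three returns from a sweep of the room.
returns = [(3.0, 0.0, 0.0, 0.9), (2.2, 30.0, -5.0, 0.3), (4.1, -45.0, 10.0, 0.8)]
print(build_topographic_map(returns))
```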
  • Once the gaming object's position is determined, the game console device 12 tracks the motion 20 of the player 16 and/or the gaming object 14. For example, the game console device 12 may determine the position 18 of the gaming object 14 and/or the player 16 within a positioning tolerance (e.g., within a meter) at a positioning update rate (e.g., once every second or once every few seconds) and tracks the motion 20 within a motion tracking tolerance (e.g., within a few millimeters) at a motion tracking update rate (e.g., once every 10-100 milliseconds).
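• As a rough sketch of how the two tolerances and update rates might be combined, the following Python fragment anchors the estimate with a coarse (meter-level) position fix and accumulates fine (millimeter-level) motion samples on top of it until the next fix. The class name, rates, and update interface are illustrative assumptions only:

```python
class PlayerTracker:
    """Minimal sketch of combining the two tolerances described above: a coarse
    position fix (meter-level, roughly once a second) anchors the estimate, and
    fine motion deltas (millimeter-level, every 10-100 ms) are accumulated on
    top of it until the next fix arrives."""

    def __init__(self, positioning_period_s=1.0):
        self.positioning_period_s = positioning_period_s
        self.last_fix_time = None
        self.estimate = (0.0, 0.0, 0.0)

    def coarse_fix(self, position, t):
        """Called at the positioning update rate with a fresh absolute position."""
        self.estimate = position
        self.last_fix_time = t

    def fine_delta(self, delta):
        """Called at the motion tracking update rate with a small displacement."""
        self.estimate = tuple(e + d for e, d in zip(self.estimate, delta))
        return self.estimate

tracker = PlayerTracker()
tracker.coarse_fix((2.0, 1.5, 1.0), t=0.0)
print(tracker.fine_delta((0.004, 0.0, 0.001)))  # estimated position 10 ms later
```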
  • During play of a video game, the game console device 12 receives a gaming object response regarding a video game function from the gaming object 14. The gaming object 14 may be a wireless game controller and/or any object used or worn by the player to facilitate play of a video game. For example, the gaming object 14 may be a simulated sword, a simulated gun, a helmet, a vest, a hat, shoes, socks, pants, shorts, gloves, etc.
  • The game console device 12 integrates the gaming object response and the motion 20 of the player and/or the gaming object 14 with the video game function. For example, if the video game function corresponds to a video tennis lesson (e.g., a ball machine feeding balls), the game console device 12 tracks the motion of the player 16 and the associated gaming object 14 (e.g., a simulated tennis racket) and maps the motion 20 with the feeding balls to emulate a real tennis lesson. The motion 20, which includes direction and velocity, enables the game console device 12 to determine how the tennis ball is being struck. Based on how it is being struck, the game console device 12 determines the ball's path and provides a video representation thereof.
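• As an illustration of integrating the tracked motion with the video game function, the following Python fragment sketches a highly simplified tennis-strike model: the incoming ball velocity is reflected about the racket face normal and combined with the racket's tracked velocity. The function names, the unit-length normal, and the restitution value are assumptions; a real game engine would also model spin, friction, and racket orientation derived from the tracked motion data:

```python
def returned_ball_velocity(ball_velocity, racket_velocity, racket_normal, restitution=0.8):
    """Very simplified strike model: reflect the incoming ball velocity about the
    racket face normal (assumed unit length) and add the racket's own velocity."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def scale(v, s): return tuple(x * s for x in v)
    def add(a, b): return tuple(x + y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))

    # Work in the racket's frame so the strike looks like a bounce off a moving wall.
    relative = sub(ball_velocity, racket_velocity)
    reflected = sub(relative, scale(racket_normal, 2.0 * dot(relative, racket_normal)))
    return add(scale(reflected, restitution), racket_velocity)

# Ball fed toward the player at 20 m/s, racket swung forward at 15 m/s.
print(returned_ball_velocity((-20.0, 0.0, -2.0), (15.0, 0.0, 3.0), (1.0, 0.0, 0.0)))
```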
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system 10 of FIG. 1 to illustrate that the position 18 and motion tracking 20 are done in three-dimensional space. Since the game console device 12 does three-dimensional positioning 18 and motion tracking 20, the distance and/or angle of the gaming object 14 and/or player 16 to the game console device 12 is a negligible factor. As such, the gaming system 10 provides accurate motion tracking of the gaming object 14 and/or player 16, which may be used to map the player's movements to a graphics image for true interactive video game play.
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes the game console device 12, the gaming objects 14-15, and one or more peripheral sensors 36-40. The game console device 12 includes a video display interface 34 (e.g., a video display driver, a video graphics accelerator, a video graphics engine, a video graphics array (VGA) card, etc.), a transceiver 32 (which may include a peripheral sensor), and a processing module 30. The processing module 30 may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module 30 may have an associated memory and/or memory element (not shown), which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the processing module 30 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. Further note that, the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64.
• In operation, the transceiver 32 generates the one or more signals within one or more frequency bands for sweeping the area to facilitate the determination of the gaming environment. In addition, the transceiver 32 generates signals during video game play to facilitate the determination of the gaming objects' and/or the player's position 18 and generates signals to facilitate the determination of the gaming object's and/or the player's motion 20. For example, the transceiver 32 may utilize a first technique, which provides a first tolerance (e.g., accuracy within a meter as may be obtained by a 2.4 GHz or 5 GHz localized positioning system as will be discussed with reference to FIGS. 6, 7, 27-29), to determine the position 18 of the player 16 and/or the gaming objects 14-15 and a second technique, which provides a second tolerance (e.g., accuracy within a few millimeters as may be obtained by a 60 GHz localized positioning system as will be discussed with reference to FIGS. 6, 7, 27-29 or a 60 GHz millimeter wave (MMW) radar system as will be discussed with reference to FIGS. 30-34), to track the motion 20 of the player 16 and/or the gaming objects 14-15.
  • The transceiver 32 receives responses (e.g., reflection of the one or more signals, absorption of the one or more signals, refraction of the one or more signals, pass through of the one or more signals, angle of incident of the one or more signals, backscattering of the one or more signals, a response to the one or more signals, and magnetization induced by the one or more signals to produce measured signal effects), converts the responses to one or more digital signals, and provides the one or more digital signals to the processing module 30.
• In an embodiment, the transceiver 32 may be an ultrasound transceiver that transmits one or more ultrasound signals within an ultrasound frequency band. The ultrasound transceiver receives at least one inbound ultrasound signal (e.g., reflection, refraction, echo, etc.) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • In an embodiment, the transceiver 32 may be a radio frequency (RF) transceiver that transmits one or more signals within a radio frequency band. The RF transceiver receives at least one inbound RF signal (e.g., reflection, refraction, response, backscatter, etc.) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • In an embodiment, the transceiver 32 is a microwave transceiver that transmits the one or more signals within a microwave frequency band. The microwave transceiver receives at least one inbound microwave signal (e.g., reflection, refraction, response, backscatter, etc.) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • In an embodiment, the transceiver 32 is an infrared transceiver that transmits the one or more signals within an infrared frequency band. The infrared transceiver receives at least one inbound infrared signal (e.g., reflection, refraction, angle of incidence, response, backscatter, etc.) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • In an embodiment, the transceiver 32 is a laser transceiver that transmits the one or more signals within a visible light frequency band. The laser transceiver, which may use fiber optics, receives at least one inbound visible light signal (e.g., reflection, refraction, response, backscatter, etc.) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • In an embodiment, the transceiver 32 is a digital camera that utilizes ambient light as the one or more signals within the visible light frequency band. The digital camera receives the at least one inbound visible light signal (e.g., reflection and/or refraction of light off the gaming environment, the player, and the gaming object) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • In an embodiment, the transceiver 32 is an ultraviolet transceiver that transmits the one or more signals within an ultraviolet radiation frequency band. The ultraviolet transceiver receives at least one inbound ultraviolet radiation signal (e.g., reflection, absorption, and/or refraction of UV light off the gaming environment, the player, and the gaming object) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
• In an embodiment, the transceiver 32 is an X-ray transceiver that transmits the one or more signals within an X-ray frequency band. The X-ray transceiver receives at least one inbound X-ray signal (e.g., reflection, absorption, and/or refraction of X-rays off the player and/or the gaming object) that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects.
  • In an embodiment, the transceiver 32 is a magnetic source that transmits the one or more signals as one or more magnetic signals. The magnetic source receives at least one inbound magnetic field that facilitates the measuring of the at least one of: the reflection of the one or more signals, the absorption of the one or more signals, refraction of the one or more signals, the pass through of the one or more signals, the angle of incident of the one or more signals, the backscattering of the one or more signals, and the magnetization induced by the one or more signals to produce measured signal effects. For instance, the magnetic source may include three coils to generate magnetic gradients in the x, y and z directions of the magnetic source. The coils may be powered by amplifiers that enable rapid and precise adjustments of the coil's field strength and direction.
• In an embodiment, the transceiver 32 may include one or more of the ultrasound transceiver, the RF transceiver, the microwave transceiver, the infrared transceiver, the laser transceiver, the digital camera, the ultraviolet transceiver, the X-ray transceiver, and the magnetic source transceiver.
  • The processing module 30 receives the one or more digital signals from the transceiver 32 and processes them to determine the gaming environment 22, the position 18 of the player 16 and/or the gaming objects 14-15, and the motion 20 of the player 16 and/or the gaming object 14. Such processing includes one or more of determining reflection of the one or more signals, determining the absorption of the one or more signals, determining refraction of the one or more signals, determining the pass through of the one or more signals, determining the angle of incident of the one or more signals, interpreting the backscattering of the one or more signals, interpreting a signal response, and determining the magnetization induced by the one or more signals. The process further includes identifying objects, players, and gaming objects based on the preceding determinations and/or interpretations.
• The one or more peripheral sensors 36-40, which may be an ultrasound transceiver, an RF transceiver, a microwave transceiver, an infrared transceiver, a laser transceiver, a digital camera, an ultraviolet transceiver, an X-ray transceiver, a magnetic source transceiver, an access point, a local positioning system transmitter, a local positioning system receiver, etc., transmit one or more signals and receive responses thereto that facilitate the determination of the player's and/or gaming object's position 18 and/or motion 20. The peripheral sensors 36-40 may be enabled at the same time using different frequencies, different time slots, time-space encoding, frequency-spacing encoding, or may be enabled at different times in a round robin, polling, or token passing manner.
  • In the example of FIG. 3, the player 16 is using two or more video gaming objects 14-15 to play the video game. In this instance, the game console device 12, alone or with data provided by one or more of the peripheral sensors 36-40, determines position of the player 16, a first associated gaming object 14, and a second associated gaming object 15 within the gaming environment 22 in accordance with the coordinate system. The game console device 12 tracks the motion of the player 16, the motion of the first associated gaming object 14, and the motion of the second associated gaming object 15.
• The game console device 12 receives a first gaming object response regarding the video game function from the first associated gaming object 14 and a second gaming object response regarding the video game function from the second associated gaming object 15. The game console device 12 integrates the first gaming object response, the second gaming object response, the motion of the player 16, the motion of the first associated gaming object, and the motion of the second associated gaming object with the video game function.
• While the preceding discussion has focused on a video game system, the concepts of position and motion tracking are applicable to a wide variety of applications. For example, the position and motion tracking apparatus may be used for home security, baby monitoring, store security, shop-lifting detection, concealed weapon detection, etc. Such an apparatus includes a transceiver section and a processing module. The transceiver section transmits one or more signals within one or more frequency bands in a given area. The one or more signals may be in the ultrasound frequency band of 20 KHz to 200 MHz, the radio frequency band of 30 Hz to 3 GHz, the microwave frequency band of 3 GHz to 300 GHz, the infrared (IR) frequency band of 300 GHz to 428 THz, the visible light frequency band of 428 THz to 750 THz (n×10¹²), the ultraviolet radiation frequency band of 750 THz to 30 PHz (n×10¹⁵), and/or the X-Ray frequency band of 30 PHz to 30 EHz (n×10¹⁸).
  • The transceiver section determines a response to the one or more signals (e.g., an inbound ultrasound signal, an inbound RF signal, an inbound microwave signal, an inbound IR signal, an inbound visible light signal, an inbound ultraviolet light signal, an inbound X-ray signal, and/or an inbound magnetic field). The transceiver section converts the response into a digital response signal.
  • The processing module processes the digital response signal to determine a measure of at least one of: reflection of the one or more signals, absorption of the one or more signals, refraction of the one or more signals, pass through of the one or more signals, angle of incident of the one or more signals, backscattering of the one or more signals, and magnetization induced by the one or more signals to produce measured signal effects. The apparatus then identifies different objects based on the measured signal effects (e.g., inanimate objects have different reflective, absorption, pass through, and/or refractive properties of the one or more signals than animate beings).
• The processing module then determines distance of the different objects with respect to itself. From this data, the apparatus generates a three-dimensional topographic map of the area to produce a digital representation of the environment. The apparatus then maps the environment to a coordinate system (e.g., a three-dimensional Cartesian coordinate system [x, y, z], a spherical coordinate system [ρ, φ, θ], etc.) and determines the position of an object or person within the environment in accordance with the coordinate system.
• Once the position is determined, the processing module tracks the motion of the object or person. For motion tracking, the transceiver section receives responses that provide millimeter accuracy of the object and/or person (e.g., 60 GHz signals, light, etc.) and converts the responses to digital signals. The processing module processes the digital signals with respect to the environment and the object or person to track motion.
  • FIG. 4 is a schematic block diagram of a side view of another embodiment of a gaming system 10 that includes one or more gaming objects 14-15, the player 16, the game console device 12, and one or more sensing tags 44 proximal to the player 16 and/or to the gaming object 14-15. The one or more sensing tags 44 may be a metal patch, an RFID tag, a light reflective material, a light absorbent material, a specific RGB [red, green, blue] color, a 60 GHz transceiver, and/or any other component, material, and/or texture that assists the game console device 12 in determining the position and/or motion of the player 16 and/or the gaming object 14-15. For example, the metal patch will reflect RF and/or microwave signals at various angles depending on the position of the metal patch with respect to the game console device 12. The game console device 12 utilizes the various angles to determine the position and/or motion of the player 16 and/or the gaming object 14-15.
  • As another example, the gaming objects 14-15 may include a game controller that is held by the player and may further include a helmet, a shirt, pants, gloves, and/or socks, which are worn by the player. Each of the gaming objects 14-15 includes one or more sensing tags 44, which facilitate the determining of the position 18 and/or motion 20. An example of a gaming system 10 using RFID tags will be discussed with reference to FIGS. 51-54.
  • FIG. 5 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a plurality of players 16, 50 and a plurality of gaming objects 14, 52. In this system, the game console device 12 determines the position 18 of the first player 16 and/or the associated gaming object 14 within the gaming environment 22 in accordance with the coordinate system. The game console device 12 also determines the position 54 of the second player 50 and/or the associated gaming object 52 within the gaming environment in accordance with the coordinate system.
  • The game console device 12 separately tracks the motion 20 of the first player 16, the motion 20 of the first associated gaming object 14, the motion 56 of the second player 50, and the motion 56 of the second associated gaming object 52. While tracking the motion of the players and/or gaming objects, the game console may receive a gaming object response regarding the video game function from the first and/or the second associated gaming object 14, 52.
  • The game console device 12 integrates the first and/or second gaming object response, the motion of the first player, the motion of the second player, the motion of the first associated gaming object, and the motion of the second associated gaming object with the video game function. While the present example shows two players and associated gaming objects, more than two players and associated gaming objects could be in the gaming environment. In this instance, the game console device 12 separately determines the position and the motion of the players and the associated gaming objects as previously discussed and integrates their play in the video gaming graphics being displayed.
  • FIG. 6 is a schematic block diagram of another embodiment of a gaming system 10 that includes a game console device 12, a plurality of localized position system (LPS) transmitters 60-64, at least one gaming object 14, and an LPS receiver 66 associated with the gaming object 14. The LPS receiver 66 and the gaming object 14 may be separate devices or an integrated device. For example, the LPS receiver 66 may be a packaged printed circuit board (PCB) that includes an integrated circuit (IC) LPS receiver and the gaming object 14 is a game controller, where the packaged PCB is attachable to the game controller. As another example, the gaming object 14 and the LPS receiver 66 may be integrated in a device, such as a cell phone, a game controller, a personal digital assistant, a handheld computing unit, etc.
• Each LPS transmitter 60-64 includes an accurate clock (e.g., an atomic clock) or is coupled to an accurate clock source (e.g., has a global positioning system (GPS) receiver) to provide an accurate time standard available for synchronization at any point in the physical area. Each LPS transmitter 60-64 transmits a spread spectrum signal containing a BPSK (binary phase shift keyed) signal in which 1's & 0's are represented by reversal of the phase of the carrier. This message is transmitted at a specific frequency at a "chipping rate" of x bits per second (e.g., 50 bits per millisecond). The message may repeat every 30 milliseconds (or more frequently) and may be referred to as a local C/A signal (Coarse Acquisition signal). This message contains information regarding the entire LPS and information regarding the LPS transmitter sending the local C/A signal.
  • The LPS receiver 66 utilizes the local C/A signals to determine its position within a given coordinate system (See FIGS. 8-14, 16-21). In particular, the LPS receiver 66 determines a time delay for at least some of the plurality of local C/A signals in accordance with the at least one clock signal. The LPS receiver 66 calculates distance (e.g., d1, d2, and d3) to the LPS transmitters 60-64 based on the time delays for at least some of the plurality of C/A signals. In other words, for each LPS RF signal received, which is received from different LPS transmitters 60-64, the LPS receiver 66 calculates a time delay with respect to the corresponding LPS transmitter. For instance, the LPS receiver 66 identifies each LPS transmitter's 60-64 signals by their distinct C/A code pattern, and then measures the time delay for each LPS transmitter. To do this, the receiver 66 produces an identical C/A sequence using the same seed number as the LPS transmitter. By lining up the two sequences, the receiver can measure the delay and calculate the distance to the LPS transmitter 60-64.
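• The delay measurement can be sketched as a sliding correlation of the locally generated C/A sequence against the received samples, with the best-matching offset converted to a distance. The following Python fragment is illustrative only; the sequence values, sample format, and chip rate are toy assumptions, and a practical receiver would resolve the delay to a small fraction of a chip:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def estimate_delay_chips(received, local_ca):
    """Slide the locally generated C/A sequence across the received samples and
    return the offset (in whole chips) with the highest correlation."""
    best_offset, best_score = 0, float("-inf")
    n = len(local_ca)
    for offset in range(len(received) - n + 1):
        score = sum(received[offset + i] * local_ca[i] for i in range(n))
        if score > best_score:
            best_offset, best_score = offset, score
    return best_offset

def delay_to_distance_m(delay_chips, chip_rate_hz):
    """Convert a measured code delay into a transmitter-to-receiver distance."""
    return (delay_chips / chip_rate_hz) * SPEED_OF_LIGHT

# Toy example: a 7-chip ±1 code received with a 3-chip delay. The numbers are
# illustrative; whole-chip resolution at this chip rate is far too coarse for
# an indoor system, which is why sub-chip timing would be used in practice.
code = [1, -1, 1, 1, -1, -1, 1]
rx = [0, 0, 0] + code + [0, 0]
delay = estimate_delay_chips(rx, code)
print(delay, delay_to_distance_m(delay, chip_rate_hz=50_000))
```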
• The LPS receiver 66 then calculates the position of the corresponding plurality of LPS transmitters based on the local C/A signals. For example, the LPS receiver 66 uses the position data of the local C/A signals to calculate the LPS transmitter's position. The LPS receiver then determines its location based on the distance of the corresponding plurality of LPS transmitters and the position of the corresponding plurality of LPS transmitters 60-64. For instance, by knowing the position and the distance of an LPS transmitter, the LPS receiver can determine its location to be somewhere on the surface of an imaginary sphere centered on that LPS transmitter and whose radius is the distance to it. When four LPS transmitters 60-64 are measured simultaneously, the intersection of the four imaginary spheres reveals the location of the receiver. Often, these spheres will overlap slightly instead of meeting at one point, so the receiver will yield a mathematically most-probable position (and often indicate the uncertainty).
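• The sphere-intersection step can be written compactly as a least-squares solve once the transmitter positions and distances are known. The following Python sketch (assuming NumPy is available; the function and variable names are illustrative) linearizes the sphere equations against the first transmitter and returns the mathematically most-probable receiver position:

```python
import numpy as np

def locate_receiver(transmitter_positions, distances):
    """Estimate the LPS receiver position from distances to transmitters at known
    positions by linearizing the sphere equations |p - p_i|^2 = d_i^2 against the
    first transmitter and solving the resulting linear system in a least-squares
    sense. With four or more non-coplanar transmitters this yields the
    most-probable point."""
    p = np.asarray(transmitter_positions, dtype=float)
    d = np.asarray(distances, dtype=float)
    p0, d0 = p[0], d[0]
    # (|p_i|^2 - d_i^2) - (|p_0|^2 - d_0^2) = 2 (p_i - p_0) . x
    A = 2.0 * (p[1:] - p0)
    b = (np.sum(p[1:] ** 2, axis=1) - d[1:] ** 2) - (np.sum(p0 ** 2) - d0 ** 2)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Four transmitters at the corners of a 4 m x 4 m room with a 2.5 m ceiling.
tx = [(0, 0, 2.5), (4, 0, 2.5), (0, 4, 2.5), (4, 4, 2.5)]
true_p = np.array([1.5, 2.0, 1.0])
dists = [np.linalg.norm(true_p - np.array(t)) for t in tx]
print(locate_receiver(tx, dists))  # ~[1.5, 2.0, 1.0]
```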
  • The LPS receiver 66, via the gaming object 14, transmits its position within the coordinate system to the game console device 12. Alternatively, the LPS receiver 66, via the gaming object 14, may provide the LPS transmitter distances (e.g., d1, d2, and d3) to the game console device 12 such that the game console device 12 can determine the position of the gaming object within the gaming environment. Depending on the frequency of transmitting the C/A signals, the accuracy of the clocks, and carrier frequency of the signals, the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may also be used to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
  • FIG. 7 is a schematic block diagram of another embodiment of a gaming system 10 that includes a game console device 12, at least one gaming object 14, a player 16, a local positioning system (LPS) transmitter 74, and a plurality of LPS receivers 68-72. The LPS transmitter 74 and the gaming object 14 may be separate devices or an integrated device. For example, the LPS transmitter 74 may be a packaged printed circuit board (PCB) that includes an integrated circuit (IC) LPS transmitter and the gaming object 14 is a game controller, where the packaged PCB is attachable to the game controller. As another example, the gaming object 14 and the LPS transmitter 74 may be integrated in a device, such as a cell phone, a game controller, a personal digital assistant, a handheld computing unit, etc.
• The LPS transmitter 74 includes an accurate clock and transmits a narrow pulse (e.g., pulse width less than 1 nanosecond) at a desired rate (e.g., once every millisecond to once every few seconds). The narrow pulse signal includes a time stamp of when it is transmitted.
  • The LPS receivers 68-72 receive the narrow pulse signal and determine their respective distances (e.g., d1, d2, and d3) to the LPS transmitter 74. In particular, an LPS receiver 68-72 determines the distance to the LPS transmitter 74 based on the time stamp and the time at which the LPS receiver received the signal. Since the narrow pulse travels at the speed of light, the distance can be readily determined.
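• The distance calculation for a timestamped narrow pulse is simply the speed of light multiplied by the measured flight time, assuming the LPS receiver's clock is synchronized to the LPS transmitter's time standard. A minimal illustrative sketch (function name is an assumption):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def pulse_distance_m(transmit_timestamp_s, receive_time_s):
    """Distance implied by a timestamped narrow pulse, assuming the LPS receiver's
    clock is synchronized with the LPS transmitter's clock."""
    return SPEED_OF_LIGHT * (receive_time_s - transmit_timestamp_s)

# A pulse stamped at t = 0 arriving 10 nanoseconds later traveled about 3 meters.
print(pulse_distance_m(0.0, 10e-9))  # ~3.0
```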
• The plurality of distances between the LPS receivers 68-72 and the LPS transmitter 74 are then processed (e.g., by the game console device 12 or by a master LPS receiver) to determine the position of the LPS transmitter 74 within the local physical area in accordance with the known positioning of the LPS receivers 68-72. For instance, with the known position of an LPS receiver and its distance to the LPS transmitter 74, the LPS receiver (the game console device or a master LPS receiver) can determine the LPS transmitter's location to be somewhere on the surface of an imaginary sphere centered on the LPS receiver and whose radius is the distance to it. When the distance to four LPS receivers is known, the intersection of the four imaginary spheres reveals the location of the LPS transmitter 74.
  • The processing of the LPS receiver to transmitter distances may be performed by a master LPS receiver, by the game console device 12, by a motion tracking processing module, and/or by an LPS computer coupled to the plurality of LPS receivers. The motion tracking processing module may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module may have an associated memory and/or memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. Further note that, the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64.
  • Depending on the frequency of transmitting the pulse signals, the accuracy of the clocks, and carrier frequency of the signals, the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may also be used to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
• With respect to FIGS. 6 and 7, an LPS system may include both a plurality of LPS transmitters as in FIG. 6 and a plurality of LPS receivers as in FIG. 7, where the LPS device on the person includes both the LPS receiver of FIG. 6 and the LPS transmitter of FIG. 7. Note that the LPS transmitters of FIG. 6 and the LPS receivers of FIG. 7 may be stand-alone devices positioned throughout a localized physical area (e.g., a home, an office, a building, etc.) or may be included within a device that is positioned throughout the localized physical area. For example, the LPS transmitters of FIG. 6 and the LPS receivers of FIG. 7 may be included in access points of a WLAN, may be included in smoke detectors, motion detectors of a security system, speakers of an intercom system, light fixtures, light bulbs, electronic equipment (e.g., computers, TVs, radios, clocks, etc.), and/or any device or object found or used in a localized physical area.
  • FIGS. 8-10 are diagrams of an embodiment of a three-dimensional Cartesian coordinate system of a localized physical area that may be used for a gaming system 10. In these figures an x-y-z origin is selected to be somewhere in the localized physical area and the position and motion of the player 16 and/or the gaming object 14 is determined with respect to the origin (e.g., 0, 0, 0). For example, a point (e.g., x1, y1, z1) on the player is used to identify its position in the gaming environment and a point (e.g., x2, y2, z2) on the gaming object 14 is used to identify its position in the gaming environment.
• As the player and/or gaming object moves, its new position is identified within the gaming environment and the relation between the old point and the new point is used to determine three-dimensional motion.
• FIGS. 11-13 are diagrams of an embodiment of a spherical coordinate system of a localized physical area that may be used for a gaming system 10. In these figures an origin is selected to be somewhere in the localized physical area and the position and motion of the player 16 and/or the gaming object 14 is determined with respect to the origin. For example, the position of the player may be represented as a vector, or spherical coordinates, (ρ, φ, θ), where ρ ≥ 0 and is the distance from the origin to a given point P; 0° ≤ φ ≤ 180° and is the angle between the positive z-axis and the line formed between the origin and P; and 0° ≤ θ ≤ 360° and is the angle between the positive x-axis and the line from the origin to P projected onto the xy-plane. In general, φ is referred to as the zenith, colatitude, or polar angle, and θ is referred to as the azimuth. φ and θ lose significance when ρ = 0, and θ loses significance when sin(φ) = 0 (at φ = 0 and φ = 180°). A point is plotted from its spherical coordinates by going ρ units from the origin along the positive z-axis, rotating φ about the y-axis in the direction of the positive x-axis, and rotating θ about the z-axis in the direction of the positive y-axis.
• For example, a point (e.g., ρ1, φ1, θ1) on the player is used to identify its position in the gaming environment and a point (e.g., ρ2, φ2, θ2) on the gaming object 14 is used to identify its position in the gaming environment. As the player and/or gaming object moves, its new position is identified within the gaming environment and the relation between the old point and the new point is used to determine three-dimensional motion. While FIGS. 8-13 illustrate two types of coordinate systems, any three-dimensional coordinate system may be used for tracking motion and/or establishing position within a gaming system.
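• For reference, the conversion between the spherical coordinates defined above and Cartesian coordinates can be sketched as follows; this is illustrative Python, and the degree-based interface and function names are assumptions:

```python
import math

def spherical_to_cartesian(rho, phi_deg, theta_deg):
    """Convert (ρ, φ, θ) as defined above (φ measured from the positive z-axis,
    θ measured from the positive x-axis in the xy-plane) to Cartesian (x, y, z)."""
    phi, theta = math.radians(phi_deg), math.radians(theta_deg)
    return (rho * math.sin(phi) * math.cos(theta),
            rho * math.sin(phi) * math.sin(theta),
            rho * math.cos(phi))

def cartesian_to_spherical(x, y, z):
    """Inverse conversion; angles that lose significance (ρ = 0) are returned as 0."""
    rho = math.sqrt(x * x + y * y + z * z)
    if rho == 0.0:
        return (0.0, 0.0, 0.0)
    phi = math.degrees(math.acos(z / rho))
    theta = math.degrees(math.atan2(y, x)) % 360.0
    return (rho, phi, theta)

# A gaming object 2 m from the origin, 60° off the z-axis, 45° around from the x-axis.
point = spherical_to_cartesian(2.0, 60.0, 45.0)
print(point)
print(cartesian_to_spherical(*point))  # round-trips to (2.0, 60.0, 45.0)
```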
  • FIG. 14 is a diagram of a method for determining position and/or motion tracking that begins at step 80 where the game console device determines the gaming environment 22 (e.g., determining the properties of the localized physical area such as height, width, depth, objects in the physical area, etc.). The method then continues at step 82 where the game console device maps the gaming environment to a coordinate system (e.g., Cartesian coordinate system of FIGS. 8-10 or spherical system of FIGS. 11-13). The method continues at step 84 where the game console device determines position of the player and/or the gaming object within the gaming environment in accordance with the coordinate system.
• The method continues at step 86 where the game console device tracks the motion of the player and/or the gaming object. In a system that includes two or more players, the game console device separately determines the players' position and separately tracks their motion. In a system where a player has two or more gaming objects, the game console device separately determines the gaming objects' position and separately tracks their motion. In a system that includes multiple players and at least one player has multiple gaming objects, the game console device separately determines the players' position, separately tracks their motion, separately determines the gaming objects' position and separately tracks the gaming objects' motion. With respect to motion tracking, an object moving at 200 miles per hour (mph) moves about 0.1 millimeters per microsecond; thus determining a new position every 10 microseconds provides about 1 millimeter accuracy for objects moving at 200 mph, with proportionally longer update intervals sufficing for the slower motions typical of game play. As such, the game console device may determine the new position of the player and/or gaming object at the motion tracking update rate and use the old and new positions to determine the motion of the player and/or gaming object.
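• The relationship between update rate, object speed, and motion-tracking accuracy is straightforward arithmetic: the displacement between samples is the speed multiplied by the update interval. A small illustrative calculation (function names are assumptions):

```python
def displacement_per_update_mm(speed_mph, update_interval_s):
    """How far an object moving at speed_mph travels between position updates."""
    speed_m_per_s = speed_mph * 0.44704   # miles per hour to meters per second
    return speed_m_per_s * update_interval_s * 1000.0

def required_update_interval_s(speed_mph, accuracy_mm):
    """Update interval needed so motion between samples stays within accuracy_mm."""
    speed_m_per_s = speed_mph * 0.44704
    return (accuracy_mm / 1000.0) / speed_m_per_s

print(displacement_per_update_mm(200.0, 10e-6))  # ~0.9 mm per 10 microseconds at 200 mph
print(required_update_interval_s(200.0, 1.0))    # ~1.1e-5 s for 1 mm accuracy at 200 mph
```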
  • The method continues at step 88 where the game console device receives a gaming object response regarding a video game function from a gaming object. The method continues at step 90 where the game console device integrates the gaming object response and the motion of the at least one of the player and the gaming object with the video game function. If the system includes multiple players and/or multiple gaming objects, the game console device 12 integrates their motion into the video game graphics being displayed. If the game console device receives multiple gaming object responses, the game console device integrates them into the video game graphics being displayed.
• FIG. 15A is a diagram of another method for determining position and/or motion tracking that begins at step 100 where an origin of a Cartesian coordinate system (e.g., the coordinate system of FIGS. 8-10) is determined. The origin may be any point within the localized physical area of the gaming environment (e.g., a point on the game console device). The method continues in one or more branches. At step 106, the initial coordinates of the player are determined using one or more of a plurality of position determining techniques as described herein. This branch continues at step 108 by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
• The other branch begins at step 102 where the coordinates of the gaming object's initial position are determined using one or more of a plurality of position determining techniques as described herein. This branch continues at step 104 by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein. Note that the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that, for an object moving at 200 mph, a tracking rate of about 1 microsecond provides roughly 0.1 mm accuracy in motion tracking.
  • FIG. 15B is a diagram of another method for determining position and/or motion tracking that begins at step 110 by determining a reference point within a coordinate system (e.g., the vector coordinate system of FIGS. 11-13). The reference point may be the origin or any other point within the localized physical area. The method continues in one or more branches. At step 116, a vector with respect to the reference point is determined to indicate the player's initial position, which may be done by using one or more of a plurality of position determining techniques as described herein. This branch continues at step 118 by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
• The other branch begins at step 112 by determining a vector with respect to the reference point for the gaming object to establish its initial position, which may be done by using one or more of a plurality of position determining techniques as described herein. This branch continues at step 114 by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein. Note that the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that, for an object moving at 200 mph, a tracking rate of about 1 microsecond provides roughly 0.1 mm accuracy in motion tracking.
  • FIGS. 16-18 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system 10. In these figures an xyz origin is selected to be somewhere in the localized physical area and the initial position of a point being tracked on the player and/or gaming object is determined based on its Cartesian coordinates with respect to the origin. As a point moves from one position (e.g., x0, y0, z0) to a new position (e.g., x1, y1, z1), the movement is tracked based on the two positions (e.g., Δx=x0-x1, Δy=y0-y1, Δz=z0-z1). Note that the player and the gaming object may each have several points that are tracked and used to determine position and motion.
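• The per-axis motion computation described above amounts to subtracting consecutive samples of the same tracked point; dividing by the sample interval gives a velocity estimate. A minimal sketch (here the displacement is taken toward the newer sample; the function names are illustrative):

```python
def motion_vector(old_point, new_point):
    """Per-axis displacement of a tracked point between two position samples."""
    return tuple(n - o for o, n in zip(old_point, new_point))

def velocity_m_per_s(old_point, new_point, sample_interval_s):
    """Velocity estimate from two consecutive samples of the same tracked point."""
    return tuple(d / sample_interval_s for d in motion_vector(old_point, new_point))

# A point on the gaming object moving mostly along +x between two samples 10 ms apart.
p0, p1 = (0.10, 0.50, 1.00), (0.14, 0.51, 1.00)
print(motion_vector(p0, p1))              # (0.04, 0.01, 0.0) meters
print(velocity_m_per_s(p0, p1, 0.01))     # (4.0, 1.0, 0.0) meters per second
```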
• The positioning and motion tracking of the player (i.e., one or more points on the player) and/or gaming object (i.e., one or more points on the gaming object) may be done with respect to the origin or with respect to each other. For instance, the gaming object's position and motion may be determined with reference to the origin and the position and motion of the player may be determined with reference to the position and motion of the gaming object. Alternatively, the player's position and motion may be determined with reference to the origin and the position and motion of the gaming object may be determined with reference to the player's position and motion.
• FIGS. 19-21 are diagrams of an embodiment of a spherical coordinate system of a localized physical area that may be used for a gaming system 10. In these figures an origin, or reference point, is selected to be somewhere in the localized physical area and the initial position of a point being tracked on the player and/or gaming object is determined based on its spherical coordinates with respect to the origin. As a point moves from one position (e.g., ρ0, φ0, θ0) to a new position (e.g., ρ1, φ1, θ1), the movement is tracked based on the two positions (e.g., ΔV=V0-V1 or Δρ=ρ0-ρ1, Δφ=φ0-φ1, Δθ=θ0-θ1). Note that the player and the gaming object may each have several points that are tracked and used to determine position and motion.
• The positioning and motion tracking of the player (i.e., one or more points on the player) and/or gaming object (i.e., one or more points on the gaming object) may be done with respect to the origin of the spherical coordinate system or with respect to each other. For instance, the gaming object's position and motion may be determined with reference to the origin and the position and motion of the player may be determined with reference to the position and motion of the gaming object. Alternatively, the player's position and motion may be determined with reference to the origin and the position and motion of the gaming object may be determined with reference to the player's position and motion.
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking that begins at step 120 by determining environment parameters (e.g., the gaming environment) of the physical area in which the gaming object resides and/or in which the game system resides. The environmental parameters include, but are not limited to, height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • The method continues at step 122 by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 8-13). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room. In addition, inanimate objects in the room (e.g., a couch, a chair, etc.) are mapped to the coordinate system based on their physical location in the room.
  • The method continues at step 124 by determining the coordinates of the player's, or players', position in the physical area. The method continues at step 126 by determining the coordinates of a gaming object's initial position. Note that the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the initial position of the player may be used to determine the initial position of the gaming object. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • The method continues at step 128 by updating the coordinates of the player's, or players', position in the physical area to track the player's, or players', motion. The method continues at step 130 by updating the coordinates of a gaming object's position to track its motion. Note that the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the motion of the player may be used to determine the motion of the gaming object. Note that one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • In another embodiment, the method of FIG. 22 may be performed by the game console device that begins with determining at least one positioning coordinate for the player with respect to an origin of the coordinate system and determining at least one positioning coordinate for the gaming object with respect to the at least one positioning coordinate for the player. The method continues with the game console device determining at least one next positioning coordinate for the player with respect to the origin and determining at least one next positioning coordinate for the gaming object with respect to the at least one next positioning coordinate for the player. The method continues with the game console device determining the motion of the player, with respect to the origin, based on the at least one positioning coordinate for the player and the at least one next positioning coordinate for the player. The method also includes the game console device determining the motion of the gaming object, with respect to the player, based on the at least one positioning coordinate for the gaming object and the at least one next positioning coordinate for the gaming object.
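  • The relative referencing described in the two preceding paragraphs can be summarized in a short sketch (the Cartesian samples and values are hypothetical): the player's motion is taken as the difference of successive player coordinates relative to the origin, while the gaming object's motion is taken as the difference of successive object coordinates relative to the player's concurrent position.

```python
def delta(a, b):
    """Component-wise difference of two coordinate tuples."""
    return tuple(x - y for x, y in zip(a, b))

# Hypothetical Cartesian samples at two successive update times (metres).
player_t0, player_t1 = (1.0, 2.0, 0.0), (1.1, 2.0, 0.0)         # relative to origin
object_t0, object_t1 = (1.4, 2.3, 1.1), (1.6, 2.2, 1.3)         # relative to origin

player_motion = delta(player_t1, player_t0)                     # motion w.r.t. origin
object_rel_t0 = delta(object_t0, player_t0)                     # object position w.r.t. player
object_rel_t1 = delta(object_t1, player_t1)
object_motion_rel_player = delta(object_rel_t1, object_rel_t0)  # motion w.r.t. player
```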
  • FIG. 23 is a diagram of another method for determining position and/or motion tracking that begins at step 140 by determining a reference point within the physical area in which the gaming object lays and/or in which the game system lays. The method then continues at step 142 by determining a vector for a player's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 11-13). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • The method continues at step 144 by determining a vector of a gaming object's initial position. Note that the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the initial position of the player may be used to determine the initial position of the gaming object. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • The method continues at step 146 by updating the vector of the player's, or players', position in the physical area to track the player's motion. The method continues at step 148 by updating the vector of the gaming object's position to track its motion. Note that the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the motion of the player may be used to determine the motion of the gaming object. Note that one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • FIG. 24 is a diagram of another method for determining position and/or motion tracking that begins at step 150 by determining environment parameters of the physical area in which the gaming object lays and/or in which the game system lays. The environmental parameters include, but are not limited to, height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • The method continues at step 152 by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 16-18). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room. In addition, objects in the room (e.g., a couch, a chair, etc.) are mapped to the coordinate system based on their physical location in the room.
  • The method continues at step 154 by determining the coordinates of the gaming object's initial position in the physical area. The method continues at step 156 by determining the coordinates of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • The method continues at step 158 by updating the coordinates of the gaming object's position in the physical area to track its motion. The method continues at step 160 by updating the coordinates of the player's position to track the player's motion with respect to the gaming object. Note that one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • In another embodiment, the method of FIG. 24 may be performed by the game console device that begins with determining at least one positioning coordinate for the gaming object with respect to an origin of the coordinate system and determining at least one positioning coordinate for the player with respect to the at least one positioning coordinate for the gaming object. The method continues with the game console device determining at least one next positioning coordinate for the gaming object with respect to the origin and determining at least one next positioning coordinate for the player with respect to the at least one next positioning coordinate for the gaming object.
  • The method continues with the game console device determining the motion of the gaming object, with respect to the origin, based on the at least one positioning coordinate for the gaming object and the at least one next positioning coordinate for the gaming object. The method continues with the game console device determining the motion of the player, with respect to the gaming object, based on the at least one positioning coordinate for the player and the at least one next positioning coordinate for the player.
  • FIG. 25 is a diagram of another method for determining position and/or motion tracking that begins at step 162 by determining a reference point within the physical area in which the gaming object lays and/or in which the game system lays. The method continues at step 164 by determining a vector for a gaming object's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 19-21). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • The method continues at step 166 by determining a vector of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • The method continues at step 168 by updating the vector of the gaming object's position in the physical area to track its motion. The method continues at step 170 by updating the vector of the player's position with respect to the gaming object's motion to track the player's motion. Note that one or more of the plurality of motion techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • FIG. 26 is a diagram of another embodiment of a coordinate system of a gaming system that is an extension of the coordinate systems discussed above. In this embodiment, the coordinate system includes a positioning coordinate grid 172 and a motion tracking grid 174, where the motion tracking grid 174 has a finer resolution than the positioning coordinate grid 172. For example, the player and/or gaming object may be positioned anywhere within the gaming environment at a given time, but, for a given time interval (e.g., 1 second), the player's and/or gaming object's position will be relatively fixed. However, within this relatively stationary position, the player and/or gaming object may move (e.g., a head bob, slash of the gaming object, turn sideways, etc.) during the given time interval. Thus, the low resolution (e.g., within a meter) of the positioning coordinate grid 172 can be adequately used to establish the player's and/or gaming object's relatively stationary position for the given time interval. Within the given time interval, the finer resolution (e.g., within a few millimeters) of the motion tracking grid 174 is used at a higher update rate (e.g., once every 10 mS) to accurately track the motion of the player and/or gaming object. Note that, once the relatively stationary position of the player and/or gaming object for the given time period is established, the motion tracking can be focused on the immediate area of the relatively stationary position.
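  • One possible realization of the coarse/fine grid pair of FIG. 26 is sketched below (the grid resolutions, update rates, and class name are assumptions for illustration only): the coarse grid fixes the relatively stationary position once per positioning interval, and motion within that interval is reported as fine-grid offsets from the coarse cell.

```python
class TwoResolutionTracker:
    """Coarse positioning grid plus fine motion tracking grid, in the spirit of FIG. 26."""

    def __init__(self, coarse_res=1.0, fine_res=0.005):
        self.coarse_res = coarse_res   # positioning grid cell size, metres (~1 m)
        self.fine_res = fine_res       # motion grid cell size, metres (~5 mm)
        self.coarse_cell = None

    def update_position(self, xyz):
        """Called at the slow rate (e.g., once per second): snap to the coarse grid."""
        self.coarse_cell = tuple(round(v / self.coarse_res) for v in xyz)

    def track_motion(self, xyz):
        """Called at the fast rate (e.g., every 10 ms): fine-grid offset from the coarse cell."""
        anchor = tuple(c * self.coarse_res for c in self.coarse_cell)
        return tuple(round((v - a) / self.fine_res) * self.fine_res
                     for v, a in zip(xyz, anchor))
```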
  • FIG. 27 is a schematic block diagram of an embodiment of a wireless communication system that includes a plurality of access points 180-184, a gaming console device 12, a gaming object 14, a device 186, and a local positioning system (LPS) receiver 66. The LPS receiver 66 is associated with the gaming object 14 and/or with the player 16. The game console device 12 is coupled to the plurality of access points (AP) 180-184 and to at least one wide area network (WAN) connection (e.g., digital subscriber loop (DSL) connection, cable modem, satellite connection, etc.). In this manner, the game console device 12 may function as the bridge, or hub, for the WLAN to the outside world.
  • The access points 180-184 are positioned throughout a given area to provide a seamless WLAN for the given area (e.g., a house, an apartment building, an office building, etc.). The device 186 may be any wireless communication device that includes circuitry to communicate with a WLAN. For example, the device may be a cell phone, a computer, a laptop, a PDA, a cordless phone, etc.
  • In addition, each access point 180-184 includes an accurate clock (e.g., an atomic clock) or is coupled to an accurate clock source to provide an accurate time standard for synchronization at any point in the physical area. Each AP transmits a spread spectrum signal (s1) containing a BPSK (binary phase shift keyed) signal, in which 1's and 0's are represented by reversals of the phase of the carrier, or a signal having some other format (e.g., FM, AM, QAM, QPSK, ASK, FSK, MSK). This message is transmitted at a specific frequency at a “chipping rate” of x bits per second (e.g., 50 bits per second). The signal may repeat every 10-30 milliseconds (or longer duration) and it contains information regarding the entire LPS and information regarding the AP transmitting the signal. Alternatively, the signal may be a very narrow pulse (e.g., less than 1 nanosecond), repeated at a desired rate (e.g., 1-100 KHz).
  • The LPS receiver 66 utilizes the signals to determine its position within a given coordinate system (See FIGS. 8-14, 16-21). For instance, the LPS receiver 66 determines a time delay (e.g., t1, t2, and t3) for at least some of the plurality of signals in accordance with the at least one clock signal. The LPS receiver 66 calculates a distance to a corresponding one of the plurality of APs based on the time delays of the signals (s1). In other words, for each signal (s1) received from a different AP, or from a subset of the APs (e.g., at a minimum three and preferably four), the LPS receiver 66 calculates a time delay and uses the resulting distances to triangulate its position in three-dimensional space. For instance, the LPS receiver 66 identifies each AP signal by its distinct code pattern, and then measures the time delay for each AP. To do this, the receiver 66 produces an identical signal sequence using the same seed number as the AP. By lining up the two sequences, the receiver 66 can measure the delay and calculate the distance to the AP.
  • The LPS receiver 66 then determines the position of the corresponding plurality of APs based on the signals. For example, the LPS receiver 66 uses the position data in the signals to determine the APs' positions. The LPS receiver 66 then determines its location based on the distances to the APs and the positions of the APs. For instance, by knowing the position of an AP and the distance to it, the LPS receiver 66 can determine its location to be somewhere on the surface of an imaginary sphere centered on that AP and whose radius is the distance to it. When four APs are measured simultaneously, the intersection of the four imaginary spheres reveals the location of the receiver. Often, these spheres will overlap slightly instead of meeting at one point, so the receiver will yield a mathematically most-probable position (and often indicate the uncertainty).
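  • A simplified numerical sketch of the sphere-intersection step follows (the AP coordinates, the linearized least-squares solve, and the use of ideal one-way delays are assumptions; an actual receiver may implement this differently). Each measured delay is converted to a range, and the ranges to four APs are combined to estimate the receiver position.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def position_from_delays(ap_positions, delays):
    """Estimate receiver position from one-way delays to four or more APs."""
    aps = np.asarray(ap_positions, dtype=float)
    ranges = C * np.asarray(delays, dtype=float)
    # Subtracting the first AP's sphere equation from the others linearizes the problem.
    A = 2.0 * (aps[1:] - aps[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(aps[1:] ** 2, axis=1) - np.sum(aps[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Hypothetical AP layout (metres) and the delays a receiver at (2.0, 1.5, 1.0) would observe.
aps = [(0, 0, 2.5), (5, 0, 2.5), (0, 4, 2.5), (5, 4, 0.5)]
true = np.array([2.0, 1.5, 1.0])
delays = [np.linalg.norm(true - np.array(a)) / C for a in aps]
print(position_from_delays(aps, delays))   # approximately [2.0, 1.5, 1.0]
```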
  • Depending on the frequency of transmitting the signal (s1), the accuracy of the APs' clocks, and the carrier frequency of the signal, the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may be used to determine the relative position and to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
  • FIG. 28 is a schematic block diagram of another embodiment of a wireless communication system that includes a plurality of access points 180-184, a gaming console device 12, a gaming object 14, and the device 186. The gaming object 14 and/or the player 16 may have associated therewith a local positioning system (LPS) transmitter 74. The game console device 12 is coupled to the plurality of access points (AP) 180-184, which are positioned throughout a given area to provide a seamless WLAN for the given area (e.g., a house, an apartment building, an office building, etc.). In addition, the game console device 12 is coupled to at least one wide area network (WAN) connection (e.g., DSL connection, cable modem, satellite connection, etc.). In this manner, the game console device may function as the bridge, or hub, for the WLAN to the outside world.
  • The LPS transmitter 74 includes an accurate clock and transmits a narrow pulse (e.g., pulse width less than 1 nanosecond) at a desired rate (e.g., once every millisecond to once every few seconds). The narrow pulse signal includes a time stamp of when it is transmitted.
  • The APs 180-184 receive the narrow pulse signal and determine their respective distances to the LPS transmitter 74. In particular, an AP determines the distance to the LPS transmitter 74 based on the time stamp and the time at which the AP received the signal. Since the narrow pulse travels at the speed of light, the distance can be readily determined.
  • The plurality of distances between the APs 180-184 and the LPS transmitter 74 are then processed to determine the position of the LPS transmitter 74 within the local physical area in accordance with the known positioning of the APs. For instance, with the known position and the distance of an AP to the LPS transmitter 74, an AP can determine the LPS transmitter's location to be somewhere on the surface of an imaginary sphere centered on that AP and whose radius is the distance to it. When the distance to four APs is known, the intersection of the four imaginary spheres reveals the location of the LPS transmitter.
  • The processing of the AP to transmitter 74 distances may be performed by a master AP, by the game console device 12, by a motion tracking processing module, and/or by an LPS computer coupled to the plurality of APs 180-184. The motion tracking processing module may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module may have an associated memory and/or memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. Further note that, the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64.
  • Depending on the frequency of transmitting the signal (s1), the accuracy of the APs' clocks, and the carrier frequency of the signal, the accuracy of the gaming object's position may be within a few millimeters to about a meter. If the accuracy is the former, then this arrangement may be used to determine the relative position and to track the motion of the player and/or gaming object. If the accuracy is the latter, then this arrangement may be used to determine the player's and/or gaming object's position and another scheme would be used to track their motion.
  • FIG. 29 is a schematic block diagram of another embodiment of a wireless communication system that includes a plurality of LAN devices 192-196, a WAN coupling device 190, a game console device 12, a gaming object 14, and a player 16. Each of the LAN devices 192-196, which may be a wired device (e.g., includes an Ethernet network card, a FireWire interface, etc.) or a wireless device, includes an LPS module 198-202, and the gaming object 14 and/or the player 16 has associated therewith an LPS personal module 205. In one embodiment, the LPS modules 198-202 include an LPS transmitter 60-64 and the LPS personal module 205 includes an LPS receiver 66 as described with reference to FIG. 6.
  • In another embodiment, the LPS modules 198-202 include an LPS receiver 68-72 and the LPS personal module 205 includes an LPS transmitter 74. Note that the WAN coupling device 190 may be a cable modem, a DSL modem, a satellite receiver, a cable receiver, and/or any other device that provides a WAN connection 206 to a WAN network (e.g., the internet, a public phone system, a private network, etc.).
  • FIGS. 30 and 31 are top and side view diagrams of an embodiment of determining position and/or motion tracking using RF and/or microwave signaling. In this embodiment, a transceiver 32 (which may be included in the game console device, coupled to a game console, coupled to a remote game console, or coupled to a server via a WAN connection) transmits a plurality of beamformed signals at one or more frequencies (e.g., frequencies in the ISM band, 29 GHz, 60 GHz, above 60 GHz, and/or other millimeter wavelengths (MMW)) to sweep the physical area. For each signal 210 transmitted, the transceiver 32 determines the reflected signal 212 energy and may also determine the refracted signal 216 energy. The transceiver 32 may also determine the pass through signal 214 component. Since different objects reflect, refract, and/or pass through RF to MMW signals in different ways, the game console device 12 can identify an object based on the reflected, refracted, and/or pass through signal energies. For example, human beings reflect, refract, and/or pass through RF and MMW signals in a different way than inanimate objects such as furniture, walls, plastics, metals, clothing, etc.
  • In this manner, a three dimensional image of the physical area is obtained. Further analysis of the reflected, pass through, and/or refracted signals yields the distance to the transceiver 32. From the distances for a plurality of beamformed signals, the positions of the objects (including the player and the gaming object) may be determined. Note that more than one transceiver may be used to determine the three-dimensional image of the physical area and/or to determine positioning and/or motion tracking within the physical area. A paper entitled “Public Security Screening for Metallic Objects with Millimeter Wave Images”, The IEE International Symposium on Imaging for Crime Detection and Prevention (ICDP 2005), 7-8 June 2005, pages 1-4, discusses basic elements of MMW imaging and is incorporated herein by reference. Beamforming is discussed in a patent application entitled “BEAMFORMING AND/OR MIMO RF FRONT-END AND APPLICATIONS THEREOF,” having a Ser. No. of 11/527,961, and a filing date of Sep. 27, 2006, which is incorporated herein by reference.
  • In addition to determining the position of objects, the transceiver 32 using MMW signaling can track the motion of the player and/or gaming object. With MMW signaling, the wavelength of a 60 GHz signal is approximately 5 millimeters. Thus, a ninety degree phase shift of the signal corresponds to a 1.25 millimeter movement. Accordingly, by transmitting the signals at a motion tracking rate (e.g., once every 10-30 mS), the motion of the player and/or gaming object can be tracked with millimeter accuracy.
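  • The millimeter-scale resolution quoted above follows directly from the carrier wavelength. A brief worked computation (the 60 GHz carrier and ninety degree phase shift are the example figures from the text; the helper name is an assumption):

```python
C = 299_792_458.0            # speed of light, m/s
f_carrier = 60e9             # 60 GHz MMW carrier
wavelength = C / f_carrier   # approximately 5.0 mm

def displacement_from_phase(delta_phase_deg):
    """One-way path-length change implied by a measured carrier phase shift."""
    return (delta_phase_deg / 360.0) * wavelength

print(displacement_from_phase(90.0) * 1e3)   # about 1.25 mm, matching the text
```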
  • FIG. 32 is a schematic block diagram of an embodiment of a transceiver 32 that includes a processing module 220, one or more image intensity sensors 222, and an RF transmitter 224. The RF transmitter 224 includes an oscillator 228, a plurality of power amplifiers (PA) 230-232, and a beamforming module 226 coupled to a plurality of antenna structures. The plurality of antenna structures may be configurable antenna structures as discussed in the patent application entitled “INTEGRATED CIRCUIT ANTENNA STRUCTURE”, having a Ser. No. of 11/648,826, and a filing date of Dec. 29, 2006, the patent application entitled “MULTIPLE BAND ANTENNA STRUCTURE”, having a Ser. No. of 11/527,959, and a filing date of Sep. 27, 2006, and/or the patent application entitled “MULTIPLE FREQUENCY ANTENNA ARRAY FOR USE WITH AN RF TRANSMITTER OR TRANSCEIVER”, having a Ser. No. of 11/529,058, and a filing date of Sep. 28, 2006, all of which are incorporated herein by reference.
  • The processing module 220 may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module may have an associated memory and/or memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. Further note that, the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in FIGS. 1-64.
  • In operation, the oscillator 228 provides an oscillation at a desired frequency (e.g., within the ISM band, within the licensed and/or unlicensed RF communication bands of 450 MHz up to 29 GHz, 60 GHz, between the microwave and IR frequency bands, etc.). The power amplifiers 230-232 amplify the oscillation to produce outbound signals. The beamforming module 226 adjusts the phase and/or amplitude of at least one of the outbound signals to produce an in-air beamformed signal 212. The selection of the phase and/or amplitude focuses the energy of the beamformed signal 212 in a particular direction. As such, by adjusting the phase and/or amplitude of one or more outbound signals, a beamformed signal 212 can be directed in any two or three dimensional direction within the physical area. In addition, the desired frequency of the oscillation may be adjusted to provide a frequency spectrum sweep of the physical area.
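  • A minimal sketch of the phase selection used for steering, assuming a uniform linear array of antenna elements and a narrowband signal (the element count, spacing, and steering angle are illustrative, not taken from the figures):

```python
import numpy as np

def steering_weights(num_elements, spacing_m, freq_hz, steer_deg):
    """Complex per-element weights that focus a uniform linear array toward steer_deg."""
    c = 299_792_458.0
    k = 2 * np.pi * freq_hz / c                          # wavenumber
    n = np.arange(num_elements)
    phase = -k * n * spacing_m * np.sin(np.radians(steer_deg))
    return np.exp(1j * phase)                            # unit amplitude, progressive phase

# Example: four elements at half-wavelength spacing for a 60 GHz carrier, steered 20 degrees.
w = steering_weights(num_elements=4, spacing_m=0.0025, freq_hz=60e9, steer_deg=20.0)
```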
  • The one or more image intensity sensors 222 measure the temperature of the objects, which is a function of the reflectivity, emissivity, and transmissivity of the surfaces in the physical area. Emissivity is the ratio of the radiation intensity of a nonblack body to the radiation intensity of a blackbody. This ratio, which is usually designated by the Greek letter ε, is always less than or equal to one. The emissivity characterizes the radiation or absorption quality of nonblack bodies. Published values are available for most substances. Emissivities vary with temperature and also vary throughout the spectrum. Transmissivity is the ratio of the transmitted radiation to the radiation arriving perpendicular to the boundary between two mediums.
  • For a given beamformed signal, the one or more image sensors provide the temperature of the object(s) to the processing module 220. The processing module 220 accumulates temperatures of the object(s) for various beamformed signals 212 and/or for various frequencies and processes the temperatures in accordance with an image intensity processing algorithm to provide a three dimensional image of the physical area and the objects in it. The image intensity processing algorithm may further include a positioning and/or motion tracking sub routine to establish the positioning and/or motion tracking of a player and/or gaming object within the physical area. Note that the gaming object may be made of one or more materials that make it readily distinguishable from other objects that may be found in the physical area. For example, it may be made of a combination of metals and plastics in a particular shape.
  • FIG. 33 is a diagram of another method for determining position and/or motion tracking that begins at step 240 by transmitting one of a plurality of beamformed signals. The method continues at step 242 by receiving one or more image intensity signals (e.g., reflectivity, emissivity, and transmissivity of the surface of the physical area) for the given beamformed signal. The method then branches to step 244 and step 246. At step 244, the likely material of the object(s) is determined based on the received one or more image intensity signals. At step 246, the distance to the object(s) is determined based on the received one or more image intensity signals. The method continues at step 248 by determining whether all of the beamformed signals have been processed (e.g., different angles and/or at different frequencies). If not, the process repeats by transmitting one of the beamformed signals.
  • When all of the beamformed signals have been transmitted, the method continues at step 250 by compiling materials and distances to establish an initial model of the physical environment. The method continues at step 252 by identifying the player or players in the physical environment based on the materials. This step may further include identifying a gaming object. The method continues at step 254 by determining the one or more player's position based on the corresponding distances. This step may further include determining the position of a gaming object. Note that this method may be continually performed to track motion of the player and/or gaming object.
  • FIG. 34 is a diagram of another method for determining position and/or motion tracking that may begin at step 260 with the optional step of adjusting the frequency (e.g., in the MMW band) of the beamforming signals for optimal human imaging. The method continues at step 262 by updating the beamforming coefficients based on the player's and/or gaming object's position. With this step, or these steps, the transceiver is focused on tracking the motion of the player and/or gaming object.
  • The method continues at step 264 by transmitting one of the beamforming signals and at step 266 by receiving one or more image intensity signals in response to the focused beamformed signal. The method then continues at step 268 by determining a distance to the object based on the received one or more image intensity signals. If all of the beamforming signals have not been transmitted, as determined at step 270, the method repeats at step 264 by transmitting the next beamforming signal.
  • When all of the beamformed signals have been transmitted for this interval, the method continues at step 272 by compiling the distances to establish the player's and/or gaming object's motion. The method continues at step 274 by determining whether it is time to update the position of the player and/or gaming object. In an embodiment, the motion tracking processing may be repeated every 10-100 mSec and the positioning may be updated once every 1-10 seconds. In general, the positioning may be updated to keep the player and/or gaming object within a desired processing region. For example, with reference to FIG. 26, the motion tracking grid is moved based on the updated positioning such that the focusing of the beamforming signals is concentrated on the motion tracking grid.
  • Returning to the discussion of FIG. 34, when it is not time to update the positioning, the method repeats. If it is time to update the positioning, the method of FIG. 33 may be used.
  • FIG. 35 is a schematic block diagram of an embodiment of a wireless communication between a gaming object 14 and a game console device 12. In this embodiment, the gaming object 14 and the game console device 12 each includes a plurality of antenna structures. The antenna radiation pattern for the plurality of structures may be as shown in FIG. 36.
  • Returning to the discussion of FIG. 35, the gaming object 14 transmits a plurality of signals via the antenna structures, where each of the signals has a different carrier frequency (e.g., f1, f2, etc.). The antenna structures of the game console device 12 are tuned for the different carrier frequencies. For example, a first array of antennas is tuned for a first frequency and a second array of antennas is tuned for a second frequency. Note that the signals may be sinusoidal tones and/or RF communications in accordance with a wireless communication protocol. With the antenna radiation pattern as shown in FIG. 36, the antenna arrays will receive their respective signals with differing signal characteristics (signal strength, phase, beam angle, constructive and destructive interference of the signals, etc.), based on the orientation of the gaming object 14 with respect to the game console device 12. An example of this will be described with reference to FIGS. 37 and 38.
  • In this manner, as the characteristics of the respective signals change, the movement of the gaming object 14 may be determined. Note that in another embodiment, the game console device 12 may transmit the signals and the gaming object 14 determines the signal characteristics.
  • With reference to FIGS. 34-36, both signal frequency and range between the end points of the medium affect the amount of attenuation. In general, attenuation is proportional to the square of the distance between the transmitter and receiver and is proportional to the square of the frequency of the radio signal. For instance, the attenuation increases as the frequency or range increases. Open outdoor attenuation is based on straightforward free space loss formulas, while indoor attenuation is more complex because signals bounce off obstacles and penetrate a variety of materials that have varying effects on attenuation. In general, an 802.11b radio operating at 11 Mbps will experience approximately 100 dB of attenuation at about 200 feet.
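  • The distance-squared and frequency-squared dependence noted above is the standard free-space path-loss relation. The sketch below evaluates it for an assumed 2.4 GHz carrier; the roughly 100 dB figure quoted for 802.11b at 200 feet also includes indoor effects, so this idealized number comes out lower.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c), i.e., proportional to d^2 and f^2."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

print(round(fspl_db(61.0, 2.4e9), 1))   # about 75.7 dB at roughly 200 feet, free space only
```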
  • FIG. 37 is a diagram of another embodiment of an antenna radiation pattern for first and second antennas for first and second frequencies. The diagram further illustrates a source position of the transmitted signals. In this example, the f2 antennas are orthogonal to each other and at a 45 degree relationship with the f1 antennas, which are orthogonal to each other.
  • FIG. 38 is a diagram of an example of receiving the RF and/or MMW signals by the various antennas of the antenna arrays. As shown, the f1 antennas receive the transmitted RF and/or MMW signal [e.g., A1 cos(ωf1(t))] with different characteristics. The received signals of the f1 antennas are combined to produce a first resulting signal [e.g., A′1 cos(ωf1(t)+θ1+φ1)], where A′1 is the received amplitude, θ is the beam angle, and φ is the phase rotation. As is shown, the f2 antennas receive the transmitted RF and/or MMW signal [e.g., A2 cos(ωf2(t))] with different characteristics. The received signals of the f2 antennas are combined to produce a second resulting signal [e.g., A′2 cos(ωf2(t)+θ2+φ2)]. The resulting signals can be processed to determine the beam angle, phase angle, and amplitude of the transmitted signals. From this information, the position and/or motion tracking may be determined.
  • To enhance the positioning and/or motion tracking, the attenuation curves of FIG. 39 may be used. As shown, f2 is of a higher frequency and thus attenuates in air more quickly over distance than the f1 signals. Note that more than two carrier frequencies may be used to facilitate the determining of the position and/or motion tracking.
  • FIGS. 40 and 41 are diagrams of an example of frequency dependent distance calculation where the phase difference at different times for different signals is determined. The positioning and/or motion tracking of an object may be done based on the phase difference, the transmission distance, and the frequency of the signals from time to time. For example, at time TX t0, the transmitter transmits a signal as shown in FIG. 40. At time RX t0+Δt0-1, the first antenna receives the signal. The phase rotation (e.g., Δφ0-1) of the received signal is determined. At time RX t0+Δt0-2, the second antenna receives the signal. The phase rotation (e.g., Δφ0-2) of the received signal is determined. With a known distance between the first and second antennas, the different phase rotations, and the carrier frequency of the signal, the distance between the transmitter and receiver can be determined. Using the beam angle, the orientation of the distance can be determined.
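  • A compact sketch of the phase-rotation comparison just described, assuming a narrowband carrier and two receive antennas with a known separation (the frequency, baseline, and phase values are illustrative; a practical system must also resolve the integer-wavelength ambiguity):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def path_difference(delta_phase_rad, freq_hz):
    """Extra path length at the second antenna implied by the inter-antenna phase difference."""
    wavelength = C / freq_hz
    return (delta_phase_rad / (2 * math.pi)) * wavelength

def angle_of_arrival(delta_phase_rad, freq_hz, baseline_m):
    """Bearing of the source relative to the two-antenna baseline (far-field assumption)."""
    return math.degrees(math.asin(path_difference(delta_phase_rad, freq_hz) / baseline_m))

def range_from_delay(delta_t_s):
    """Absolute range follows from the one-way delay relative to the known transmit time."""
    return C * delta_t_s

print(angle_of_arrival(math.radians(45), 5.8e9, 0.05))   # roughly 7.4 degrees for this example
```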
  • FIGS. 42 and 43 are diagrams of an example of constructive and destructive signaling to facilitate the determination of positioning and/or motion tracking. In this embodiment, at least two antennas physically separated by a known distance transmit different sinusoidal signals [e.g., cos(ωf1(t)) and cos(ωf2(t))]. In air, the signals combine in a constructive and destructive manner [e.g., cos(ωf1(t))+cos(ωf2(t))=2*cos(½(ωf1(t)+ωf2(t)))*cos(½(ωf1(t)−ωf2(t)))].
  • An antenna assembly of the gaming object and/or player receives the signals and, based on the constructive and destructive patterns, the distance may be determined. Obtaining multiple distances from multiple sources and knowing the source locations, the position and/or motion of the object can be determined. Such a process may be augmented by using the attenuation properties of a signal in air and/or by using multiple different frequency signals.
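  • For reference, the sum-to-product identity behind the constructive and destructive pattern can be checked numerically (the tone frequencies and time span are example values); the slowly varying cosine of the half-difference frequency is the beat envelope that the receiving antenna assembly observes.

```python
import numpy as np

f1, f2 = 5.80e9, 5.81e9                       # example tone pair
t = np.linspace(0, 2e-6, 20001)               # 2 microseconds of sample times

summed  = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
product = 2 * np.cos(np.pi * (f1 + f2) * t) * np.cos(np.pi * (f1 - f2) * t)

# cos A + cos B = 2 cos((A+B)/2) cos((A-B)/2); the two formulations agree point-wise.
assert np.allclose(summed, product)
```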
  • FIG. 44 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a gaming object 14, and a plurality of digital image sensors 290-294 (e.g., digital cameras, digital camcorders, digital image sensor, etc.). The gaming system 10 has an associated physical area in which the gaming object 14 and player 16 are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.). The game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • In this system 10, the plurality of digital imaging sensors 290-294 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures an image of the player 16 and/or gaming object 14 within the physical area based on the position of the player and/or gaming object. Note that the digital imaging sensors 290-294 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • The captured images are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the digital image sensors may be positioned and/or adjusted to focus on the player's and/or gaming object's movement. The images captured by the digital image sensors are then processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player 16 and/or gaming object 14 may include sensors (e.g., blue screen patches, etc.) thereon to facilitate the position and/or motion tracking processing.
  • FIG. 45 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a gaming object 14, and a plurality of heat sensors 300-304 (e.g., infrared thermal imaging cameras, infrared radiation thermometers, thermal imagers, ratio thermometers, optical pyrometers, fiber optic temperature sensors, etc.). The gaming system has an associated physical area in which the gaming object and player are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.). The game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • In this system 10, the plurality of heat sensors 300-304 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures a heat image of the player 16 and/or gaming object 14 within the physical area based on the position of the player and/or gaming object. Note that the heat sensors 300-304 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • The captured heat images are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the heat sensors may be positioned and/or adjusted to focus on the player and/or gaming object movement. The heat images captured by the heat sensors are then processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include sensors thereon to facilitate the position and/or motion tracking processing.
  • FIG. 46 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a gaming object 14, and a plurality of electromagnetic sensors 310-314 (e.g., magnetometers, gauss meters, magnetic field sensors, electromagnetic and EMC/EMI/RFI probes for measuring electromagnetic fields, etc.). The gaming system has an associated physical area in which the gaming object 14 and player 16 are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.). The game console device may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • In this system, the plurality of electromagnetic sensors 310-314 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures an electromagnetic image of the player and/or gaming object within the physical area based on the position of the player and/or gaming object. Note that the electromagnetic sensors 310-314 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • The captured electromagnetic images are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the electromagnetic sensors may be positioned and/or adjusted to focus on the player and/or gaming object movement. The electromagnetic images captured by the electromagnetic sensors are then processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include sensors thereon to facilitate the position and/or motion tracking processing.
  • FIG. 47 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a gaming object 14, and a plurality of laser sensors 320-324 (e.g., laser distance measurement photoelectric sensors, digital laser sensors, short range laser sensors, medium range laser sensors, etc.). The gaming system has an associated physical area in which the gaming object 14 and player 16 are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.). The game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • In this system, the plurality of laser sensors 320-324 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures laser based relative distances of the player and/or gaming object within the physical area based on the position of the player and/or gaming object. Note that the laser sensors 320-324 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • The relative distances are initially used to determine the position of the gaming object 14 and/or the player 16. Once the player's and/or gaming object's position is determined, the laser sensors may be positioned and/or adjusted to focus on the player and/or gaming object movement. Subsequent relative distances are processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include sensors thereon to facilitate the position and/or motion tracking processing.
  • FIG. 48 is a diagram of another method for determining position and/or motion tracking that begins at steps 330 and 332 by determining the relative position of the player and/or gaming object using two or more positioning techniques (e.g., RF beamforming, laser sensors, etc.). The method continues at step 334 by combining the two or more positions to produce the initial position. Note that the two or more positioning techniques may be weighted based on a variety of factors including, but not limited to, accuracy, distance, interference, availability, etc. Note that one technique may be used to capture the position in one plane (e.g., x-y plane), a second technique may be used to capture the position in a second plane (e.g., x-z plane), and/or a third technique may be used to capture the position in a third plane (e.g., y-z plane).
  • The method continues at steps 336 and 338 by determining the motion of the player and/or gaming object using two or more motion tracking techniques. Note that in many instances the same technique may be used for positioning as for motion tracking, where the motion tracking is done with greater resolution and at a greater rate than the positioning. The method continues at step 340 by combining the two motion tracking values to produce the current motion of the player and/or gaming object. Note that the two or more motion tracking techniques may be weighted based on a variety of factors including, but not limited to, accuracy, availability, speed of movement, interference, distance, user preference, etc. Further note that the motion tracking of a player and/or gaming object may be enhanced by including a positioning and/or motion tracking sensor on the player and/or gaming object.
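  • One possible form of the weighted combining in steps 334 and 340 is sketched below, assuming each technique reports a position (or motion) estimate together with a scalar weight derived from the factors listed above (the weighting rule and values are illustrative, not prescribed by the method):

```python
import numpy as np

def fuse_estimates(estimates, weights):
    """Weighted average of estimates from multiple positioning or motion tracking techniques."""
    est = np.asarray(estimates, dtype=float)
    w = np.asarray(weights, dtype=float)
    return (w[:, None] * est).sum(axis=0) / w.sum()

# Hypothetical case: RF beamforming trusted more than a partially occluded laser sensor.
rf_pos    = (2.03, 1.48, 1.02)
laser_pos = (2.10, 1.40, 1.05)
print(fuse_estimates([rf_pos, laser_pos], weights=[0.8, 0.2]))
```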
  • The method continues at step 342 by determining whether the position needs to be updated (e.g., change focus of motion tracking processing). If yes, the method repeats at steps 330 and 332. If not, the method repeats at steps 336 and 338.
  • FIG. 49 is a diagram of another method for determining position and/or motion tracking that begins at step 350 by evaluating the physical environment in which the player and/or gaming object are located. The game console may also be located in the physical environment, which may be a room, a portion of a room, an office, and/or any area in which a player can play a video game. The method continues at step 352 by selecting one or more of a plurality of positioning techniques for determining the position of the player and/or gaming object based on the physical environment.
  • The method continues at step 354 by determining the position of the player and/or gaming object using the one or more positioning techniques. The method continues at step 356 by selecting one or more of motion tracking techniques to determine the motion of the player and/or gaming object based on the environment and/or the position of the player and/or gaming object. The method continues at step 358 by determining the motion of the player and/or gaming object using the selected motion tracking technique(s). The method continues at step 360 by determining whether the position of the player and/or gaming object needs to be updated and repeats as shown.
  • FIG. 50 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a gaming object 14, an RFID reader 370, at least one RFID tag 372 associated with the player 16, and at least one RFID tag 372 associated with the gaming object 14. The gaming system has an associated physical area in which the gaming object and player are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.). The game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • In this system, the RFID reader 370 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) communicates with the RFID tags 372 to determine distances of the player 16 and/or gaming object 14 within the physical area. This may be done by using the RFID system (e.g., the reader and the tags) as an RF radar system. For example, the RFID system may use a backscatter technique to determine distances between the RFID reader and the RFID tags. In another example, the RFID system may use frequency modulation to compare the frequency of two or more signals, which is generally more accurate than timing the signal. By changing the frequency of the returned signal and comparing that with the original, the difference can be easily measured.
  • As another example, the RFID system may use a continuous wave radar technique. In this instance, a “carrier” radar signal is frequency modulated in a predictable way, typically varying up and down with a sine wave or sawtooth pattern at audio frequencies or other desired frequency. The signal is then sent out from one antenna and received on another and the signal can be continuously compared. Since the signal frequency is changing, by the time the signal returns to the source the broadcast has shifted to some other frequency. The amount of that shift is greater over longer times, so greater frequency differences mean a longer distance. The amount of shift is therefore directly related to the distance traveled, and can be readily determined. This signal processing is similar to that used in speed detecting Doppler radar.
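  • The frequency-shift relation described for the continuous wave technique can be written compactly as shown below (the sweep bandwidth, sweep period, and beat frequency are example values; the factor of two reflects the out-and-back path from reader to tag and back):

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_freq_hz, sweep_bandwidth_hz, sweep_period_s):
    """Continuous-wave ranging: distance = c * f_beat / (2 * sweep slope)."""
    slope = sweep_bandwidth_hz / sweep_period_s   # Hz per second
    return C * beat_freq_hz / (2.0 * slope)

# Example: a 150 MHz sweep over 1 ms with a measured 5 kHz beat frequency.
print(fmcw_range(5e3, 150e6, 1e-3))               # approximately 5 metres
```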
  • The distances are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the RFID system may be adjusted to focus on the player and/or gaming object movement. Subsequently determined distances are processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or of the player.
  • FIG. 51 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a gaming object 14, a plurality of RFID readers 370, at least one RFID tag 372 associated with the player 16, and at least one RFID tag 372 associated with the gaming object 14. The gaming system 10 has an associated physical area in which the gaming object and player are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object 14 and the game console device 12 are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.). The game console device 12 may be in the physical area or outside of the physical area, but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • In this system, one or more of the RFID readers 370 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) communicates with one or more of the RFID tags 372 to determine the distances of the player 16 and/or gaming object 14 within the physical area. This may be done by using the RFID system (e.g., the readers and the tags) as an RF radar system. For example, the RFID system may use a backscatter technique to determine distances between the RFID reader and the RFID tags. In another example, the RFID system may use frequency modulation to compare the frequency of two or more signals, which is generally more accurate than timing the signal. By changing the frequency of the returned signal and comparing that with the original, the difference can be easily measured.
  • As another example, the RFID system may use a continuous wave radar technique. In this instance, a “carrier” radar signal is frequency modulated in a predictable way, typically varying up and down with a sine wave or sawtooth pattern at audio frequencies or other desired frequency. The signal is then sent out from one antenna and received on another and the signal can be continuously compared. Since the signal frequency is changing, by the time the signal returns to the source the broadcast has shifted to some other frequency. The amount of that shift is greater over longer times, so greater frequency differences mean a longer distance. The amount of shift is therefore directly related to the distance traveled, and can be readily determined. This signal processing is similar to that used in speed detecting Doppler radar.
  • The distances are initially used to determine the position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the RFID system may be adjusted to focus on the player and/or gaming object movement. Subsequently determined distances are processed using a two-dimension and/or three-dimension algorithm to determine the motion of the gaming object and/or of the player.
  • FIG. 52 is a schematic block diagram of a side view of another embodiment of a gaming system 10 that includes a game console device 12, a gaming object 14, one or more RFID readers 370, a plurality of RFID tags 372 associated with the player 16, and a plurality of RFID tags 372 associated with the gaming object 14. The gaming system has an associated physical area in which the gaming object and player are located.
  • In this illustration, the player 16 and the gaming object 14 are within the determined relative position 378. To track the player's and gaming object's motion within the relative position 378, the one or more RFID readers 370 transmit an RFID reader transmission 374, which may be in accordance with an RF radar transmission as discussed above. Alternatively, the RFID reader transmission 374 may be a request for at least one of the RFID tags 372 to provide a response regarding information to determine its position or distance with reference to a particular point.
  • The RFID tags provide an RFID tag response 376, which may be in accordance with the RF radar transmissions discussed above. Alternatively, the RFID tags may provide a response regarding information to determine their positions or their distances to a reference point. The communication between the RFID reader(s) and RFID tags may be done in a variety of ways, including, but not limited to, a broadcast transmission and a collision detection and avoidance response scheme, in a round robin manner, in an ad hoc manner based on a desired updating rate for a given RFID tag (e.g., a slow moving tag needs to be updated less often than a fast moving tag), etc.
  • FIG. 53 is a schematic block diagram of an embodiment of an RFID reader 370 in the game console device 12 and an RFID tag 372 in the gaming object 14. The RFID reader 370 includes a protocol processing module 380, an encoding module 382, a digital to analog converter 384, an RF front-end 386, a digitization module 388, a pre-decoding module 390, and a decoding module 392. The RFID tag 372 includes a power generating circuit 394, an envelope detection module 396, an oscillation module 398, an oscillation calibration module 400, a comparator 402, and a processing module 404. The details of the RFID reader 370 are disclosed in the patent application entitled RFID READER ARCHITECTURE, having a Ser. No. of 11/377,812, and a filing date of Mar. 16, 2006, and the details of the RFID tag 372 are disclosed in the patent application entitled POWER GENERATING CIRCUIT, having a Ser. No. of 11/394,808, and a filing date of Mar. 31, 2006. Both patent applications are incorporated herein by reference.
  • FIG. 54 is a diagram of a method for determining position of a player and/or gaming object that begins at step 410 with an RFID reader transmitting a power up signal to one or more RFID tags, which may be active or passive tags. The power up signal may be a tone signal such that a passive RFID tag can generate power therefrom. The power up signal may be a wake-up signal for an active RFID tag. The method continues at step 412 with the RFID tag providing an acknowledgement that it is powered up. Note that this step may be skipped.
  • The method continues at step 414 with the RFID reader transmitting a command at time t0, where the command requests a response to be sent at a specific time after receipt of the command. In response to the command, an RFID tag provides the response and, at step 416, the reader receives it. The method continues at step 418 with the RFID reader recording the time and the tag ID. The method continues at step 420 with the reader determining the distance to the RFID tag based on the stored time, time t0, and the specific time delay.
  • The method continues at step 422 by determining whether all or a desired number of tags have provided a response. If not, the method loops as shown. If yes, the method continues at step 424 by determining the general position of the player and gaming object based on the distances. As an alternative, the general position of each of the tags may be determined from their respective distances at step 426. Note that at least three, and preferably four, distances need to be accumulated from different sources (e.g., multiple RFID readers or an RFID reader with multiple physically separated transmitters) to triangulate the RFID tag's position.
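  • The distance computation of step 420 reduces to a one-line timing relation, sketched here with example values (the command time, receive time, and the tag's fixed response delay are hypothetical; the division by two accounts for the out-and-back path):

```python
C = 299_792_458.0  # speed of light, m/s

def rfid_distance(t_command_s, t_response_rx_s, tag_fixed_delay_s):
    """Range to a tag that replies a fixed, known delay after receiving the command."""
    round_trip = (t_response_rx_s - t_command_s) - tag_fixed_delay_s
    return C * round_trip / 2.0

# Example: response observed 140 ns after the command; the tag's fixed delay is 100 ns.
print(rfid_distance(0.0, 140e-9, 100e-9))   # approximately 6 metres
```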
  • FIG. 55 is a schematic block diagram of an embodiment of a gaming object 14 that includes an integrated circuit (IC) 434, a gaming object transceiver 432 and a processing module 430. The IC 434 includes one or more of an RFID tag 446, a servo motor 448, a received signal strength indicator 444, a pressure sensor 436, an accelerometer 438, a gyrator 440, an LPS receiver 442, and an LPS transmitter 445. Note that if the gaming object 14 is an item worn by the player to facilitate playing a video game, the gaming object 14 may not include the processing module 430 and/or the gaming object transceiver 432.
  • The RFID tag is coupled to one or more antenna assemblies and the gaming object transceiver is also coupled to one or more antenna assemblies. In this instance, the RFID tag may communicate with an RFID reader using one or more carrier frequencies to facilitate positioning and/or tracking as described above. In addition, or in the alternative, the RFID tag may provide the communication path for data generated by the RSSI module, the servo motor, the pressure sensor, the accelerometer, the gyrator, the LPS receiver, and/or the LPS transmitter. Details of including a gyrator or a pressure sensor on an IC are provided in the patent application entitled GAME DEVICES WITH INTEGRATED GYRATORS AND METHODS FOR USE THEREWITH, having a Ser. No. of 11/731,318 and a filing date of Mar. 29, 2007, and in the patent application entitled RF INTEGRATED CIRCUIT HAVING AN ON-CHIP PRESSURE SENSING CIRCUIT, having a Ser. No. of 11/805,585 and a filing date of May 23, 2007. Both patent applications are incorporated herein by reference.
  • The RFID tag may use a different frequency than the gaming object transceiver for RF communications, or it may use the same, or nearly the same, frequency. In the latter case, the frequency spectrum may be shared using TDMA, FDMA, or some other sharing protocol. If the RFID tag and the gaming object transceiver share the frequency spectrum, they may share the antenna structures. Note that the antenna structures may be configurable as discussed in the patent application entitled "INTEGRATED CIRCUIT ANTENNA STRUCTURE", having a Ser. No. of 11/648,826 and a filing date of Dec. 29, 2006, the patent application entitled "MULTIPLE BAND ANTENNA STRUCTURE", having a Ser. No. of 11/527,959 and a filing date of Sep. 27, 2006, and/or the patent application entitled "MULTIPLE FREQUENCY ANTENNA ARRAY FOR USE WITH AN RF TRANSMITTER OR TRANSCEIVER", having a Ser. No. of 11/529,058 and a filing date of Sep. 28, 2006, all of which are incorporated herein by reference.
  • FIG. 56 is a schematic block diagram of an embodiment of a three-dimensional antenna structure 350 that includes at least one antenna having a radiation pattern along each of the three axes (x, y, z). Note that the 3D antenna structure 350 may include more than three antennas having radiation patterns at any angle within the three-dimensional space. Note also that the antennas may be configurable antennas, as previously discussed, to accommodate different frequency bands. FIG. 57 is a diagram of an example of an antenna radiation pattern 352 for one of the antennas of the antenna structure 350 of FIG. 56.
  • FIGS. 58 and 59 are diagrams of an example of frequency dependent motion calculation where a signal (TX) is received at time tn and another signal (TX) is received at time tn+1, where n is any number. As shown in FIG. 58, the signal is received with respect to the xy plane and with respect to the xz plane by the three antennas of FIG. 56. In this configuration, each antenna will receive the signal with a different amplitude (and possibly a different phase) due to its angle with respect to the source of the signal. From these differing received signals, the angular direction of the source with respect to the 3D antenna structure can be determined. To determine the distance between the 3D antenna structure and the source, one or more of the distance determination techniques discussed herein may be used (e.g., attenuation of the magnitude of the transmitted signal). With the distance and angle known, the position of the 3D antenna structure, which may be affiliated with a player and/or gaming object, can be determined for time tn.
  • FIG. 59 shows the signal being received at time tn+1, which is at a different angle than the signal transmitted at time tn. The differing received signals by the antennas are used to determine the angular position of the source and one or more of the distance determination techniques discussed herein may be used to determine the distance to the source. From the known angular position and the known distances, the position of the 3D antenna structure may be determined for time tn+1. Comparing the position of the 3D antenna structure at time tn with its position at time tn+1 yields its motion.
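  • As an illustrative aid for FIGS. 58 and 59 (not the actual implementation), the sketch below converts a hypothetical angle of arrival (azimuth and elevation relative to the 3D antenna structure) and a measured range into a Cartesian position sample.

```python
# Sketch of turning an angle-of-arrival plus range measurement into a
# Cartesian position sample for the 3D antenna structure. The azimuth,
# elevation, and range values are placeholders, not measured data.
import math

def position_from_angle_range(azimuth: float, elevation: float, range_m: float):
    """Spherical (azimuth, elevation, range) -> Cartesian (x, y, z) in meters."""
    x = range_m * math.cos(elevation) * math.cos(azimuth)
    y = range_m * math.cos(elevation) * math.sin(azimuth)
    z = range_m * math.sin(elevation)
    return (x, y, z)

# Source seen 30 degrees to the left and 10 degrees up, 2.5 m away.
print(position_from_angle_range(math.radians(30), math.radians(10), 2.5))
```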
  • FIG. 60 is a diagram of a method for determining motion that begins at step 360 by transmitting an RF signal at time tn. The RF signal may be a narrow pulse, may be a sinusoidal signal, and/or may be an RF transmission in accordance with a wireless communication protocol. The method continues at step 362 with the 3D antenna structure receiving the RF signal. The method continues at step 364 by determining a 3D vector of the received RF signal. An example of this is shown in FIG. 61.
  • The method continues at step 366 by transmitting another RF signal at time tn+1. The method continues at step 368 with the 3D antenna structure receiving the RF signal. The method continues at step 370 by determining a 3D vector of the received RF signal. An example of this is shown in FIG. 61. The method continues at step 372 by determining the motion of the player and/or gaming object by comparing the two 3D vectors. This process continues for each successive tn and tn+1 combination. Note that the duration between tn and tn+1 may vary depending on one or more of the video game being played, the speed of motion, the anticipated speed of motion, the quality of the motion estimation, and/or motion prediction algorithms, etc.
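  • A minimal sketch of the FIG. 60 loop follows, assuming position samples at times tn and tn+1 are already available: their difference gives a motion (velocity) estimate, and the next sampling interval is shortened when faster motion is observed. All values and function names are illustrative.

```python
# Sketch of the FIG. 60 loop: each pair of successive position samples yields
# a velocity estimate; the interval to the next sample shrinks as observed
# speed grows. All numbers are illustrative.

def velocity(p0, p1, t0, t1):
    """Finite-difference velocity between two 3D position samples."""
    dt = t1 - t0
    return tuple((b - a) / dt for a, b in zip(p0, p1))

def next_interval(speed_m_s: float, base_s: float = 0.1) -> float:
    """Shorten the sampling interval as the observed speed increases."""
    return base_s / (1.0 + speed_m_s)

p_n, p_n1 = (0.0, 0.0, 1.0), (0.2, 0.1, 1.0)
v = velocity(p_n, p_n1, 0.0, 0.1)            # (2.0, 1.0, 0.0) m/s
speed = sum(c * c for c in v) ** 0.5
print(v, next_interval(speed))
```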
  • FIG. 62 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a game console device 12, a player 16, a gaming object 14, and a plurality of directional microphones 280. The gaming system has an associated physical area in which the gaming object and player are located. The physical area may be a room, a portion of a room, and/or any other space where the gaming object 14 and game console device 12 are proximally co-located (e.g., an airport terminal, on a bus, on an airplane, etc.). The game console device 12 may be in the physical area or outside of the physical area but electronically connected to the physical area via a WLAN, WAN, telephone, DSL modem, cable modem, etc.
  • In this system, the plurality of directional microphones 380-382 periodically (e.g., in the range of once every 1 millisecond to once every 10 seconds) captures audible, near audible, and/or ultrasound signals (together, acoustic waves) of the player 16 and/or gaming object 14 within the physical area. Note that the directional microphones 380-382 may be continually repositioned to determine the player's and/or gaming object's position and/or to track the motion of the gaming object and/or player.
  • The captured audible, near audible, and/or ultrasound signals are used to determine the initial position of the gaming object and/or the player. Once the player's and/or gaming object's position is determined, the directional microphones 380-382 may be positioned and/or adjusted to focus on the player's and/or gaming object's movement. The captured signals are then processed using a two-dimensional and/or three-dimensional algorithm to determine the motion of the gaming object and/or the player. Note that the player and/or gaming object may include near audible and/or ultrasound signal generators thereon to facilitate the position determination and/or motion tracking processing.
  • FIG. 63 is a diagram of an example of audio, near audio, and ultrasound frequency bands that may be used by the system of FIG. 62. In this example, a positioning tone (e.g., a sinusoidal signal) has a frequency just above the audible frequency range (e.g., at 25-35 kHz) and/or in the ultrasound frequency band, both of which are within the bandwidth of the microphones. Thus, the microphones may serve a dual purpose: capturing audio for normal game play, game set up, game authentication, player authentication, and gaming object authentication, and capturing tones for position determination and motion tracking. In an embodiment, the gaming object and/or the player may transmit a near audible signal (e.g., a tone at 25 kHz), which is above the audible frequency range but within the bandwidth of the directional microphones 380-382. The directional microphones may adjust their position to focus in on the source of the tone. The angular positioning of the microphones and the intersection of their bearings may be used to determine the location of the gaming object and/or the player.
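  • As an illustration of locating the tone source from the microphones' angular positioning, the sketch below intersects the bearings reported by two directional microphones in the overhead (two-dimensional) view; the microphone positions and bearings are assumed values, not figures from this disclosure.

```python
# Sketch of locating a 25 kHz positioning tone by intersecting the bearings
# reported by two directional microphones (overhead, 2-D view). Microphone
# positions and bearings are placeholders.
import math

def intersect_bearings(p1, theta1, p2, theta2):
    """Each microphone at p_i reports bearing theta_i (radians); return the ray intersection."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + s*d1 = p2 + t*d2 for s using the 2-D cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None                      # bearings are (nearly) parallel
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    s = (rx * d2[1] - ry * d2[0]) / denom
    return (p1[0] + s * d1[0], p1[1] + s * d1[1])

# Source at roughly (2, 3): mic A at the origin, mic B four meters to its right.
print(intersect_bearings((0.0, 0.0), math.atan2(3, 2), (4.0, 0.0), math.atan2(3, -2)))
```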
  • FIG. 64 is a schematic block diagram of an overhead view of another embodiment of a gaming system 10 that includes a gaming object 14, a player 16, a directional microphone 390, and a game console device 12. In this embodiment, the game console device 12 and/or the gaming object 14 include one or more directional microphones 390 (and may include transmitters) that have their orientation adjusted based on the position and/or motion of the gaming object to better receive an audible signal from the gaming object, player, and/or the game console device. The gaming object 14 may also include a sound energy transmitter. In a first operation, based upon receipt of a sound energy signal from the gaming object by the directional microphones 390 and subsequent processing, a position of the gaming object may be determined. In a second operation, sound energy from the transmitters 390 is reflected by the user and/or the gaming object. Based upon the receipt of such reflected sound energy, a position of the gaming object and/or the user may be determined.
  • FIG. 65 is a schematic block diagram of an overhead view of still another embodiment of a position location system in accordance with the present invention. The position location system of FIG. 65 includes a first position location sub-system, a second position location sub-system, and processing circuitry that is coupled to the first position location sub-system and to the second position location sub-system. The position location system illustrated in FIG. 65 is deployed within a gaming environment 6502. However, in other embodiments, the position location system of FIG. 65 may be deployed in a physical area that does not support gaming. In such case, the position location system of FIG. 65 would simply be used to locate objects other than gaming objects within the physical area.
  • With the particular embodiment of FIG. 65, the video gaming system 6500 supports video gaming within the video gaming environment 6502 of the physical area. Consistent with the previous description herein, the video gaming system 6500 includes a game console 12, a gaming object 14, and a gaming object 15. During play, a player 16 holds one or both of the gaming objects 14 and 15, and the position location system performs position determination 18 and motion tracking 20 of the objects 14 and/or 15 and/or the player 16. By performing such position determination 18 and motion tracking 20, the position location system supports the player 16 interacting with a gaming function supported by the game console 12.
  • The position location system of FIG. 65 includes at least two position location sub-systems, at least two of which use differing position location techniques. For example, in the embodiment of FIG. 65, the position location system includes a first position location sub-system having position location sub-system components 6504A, 6504B, and 6504C. Likewise, the position location system includes a second position location sub-system having position location sub-system components 6506A, 6506B, and 6506C. As is illustrated in FIG. 65, each of these position location sub-system components couples to the game console 12 via wired and/or wireless communication links. The game console 12 includes processing circuitry that receives position location information from the at least two position location sub-systems and processes such information to locate one or more of the gaming object 14, the gaming object 15, and the player 16 within the gaming environment 6502. In a non-gaming embodiment, the processing circuitry locates one or more objects within the physical area without supporting gaming.
  • As is shown with the system of FIG. 65, the first position location sub-system and the second position location sub-system may each include a plurality of receivers that are oriented about the gaming environment/physical area 6502. Further, with the embodiment of FIG. 65, the first position location sub-system may include at least one transmitter. The at least one transmitter may be located in conjunction with the gaming object 14 or 15 and/or may be co-located with the various receivers of the first and second position location sub-systems that are located about the gaming area 6502. In such case, the position location sub-system components, e.g., 6504A-6506C, are distributed about the gaming area and may have substantially co-located receivers and transmitters.
  • According to one aspect of the position location system of FIG. 65, the first position location sub-system uses a first position location technique while the second position location sub-system uses a second position location technique that differs from the first position location technique. The various position location techniques that may be employed by the first and second position location sub-systems of FIG. 65 have been described previously herein with reference to FIGS. 1-64. Generally, these techniques include one or more of acoustic wave detection, RF signal detection, digital imaging, IR detection, laser distance measurement, thermal imaging, and/or multiple axis accelerometer sensing.
  • When using the acoustic wave detection technique with the system of FIG. 65, the object 14 and/or 15 may include at least one acoustic energy source, e.g., an ultrasonic transmitter, and the first position location sub-system may include a plurality of sound energy receivers that are located about the gaming environment 6502.
  • With one example of use of RF signal detection, the object 14 or 15 includes at least one RF transmitter and one or more of the first and second position location sub-systems include a plurality of receivers. With a second example of use of RF signal detection, the object 14 or 15 includes at least one RF receiver and the first and/or the second position location sub-systems include a plurality of RF transmitters. With another example of use of RF signal detection, the first and/or second position location sub-systems include at least one RF transmitter and a plurality of RF receivers.
  • With a first embodiment of the use of digital imaging, one or more of the gaming objects 14 and/or 15 include(s) a plurality of digital cameras. This technique, as was previously described herein, uses the digital cameras of the gaming objects 14 and/or 15 to recognize reference points in the gaming environment 6502 and to determine the position(s) of the object(s) 14 and/or 15 based upon these reference points. With another embodiment using digital imaging, the first and/or second position location sub-systems include a plurality of digital cameras. The first and/or second position location sub-systems identify reference points, including object reference points, captured in the digital images to locate the gaming object 14 and/or 15.
  • With an embodiment of the system of FIG. 65 using IR detection, the object may include an IR source and the first and/or second position location sub-systems include a plurality of IR receivers. With still another embodiment using IR detection, the first and/or second position location sub-systems include at least one IR source and a plurality of IR receivers. Any of these various techniques may be employed with the position location sub-systems and are illustrated further herein with reference to FIGS. 66-73.
  • Operations of the video gaming system 6500 of FIG. 65 will be described further herein with reference to FIGS. 67-73.
  • FIG. 66 is a schematic block diagram of an overhead view of yet another embodiment of a position location system in accordance with the present invention. In FIG. 66, the position location system includes a first position location sub-system, a second position location sub-system, and processing circuitry that couples to the first position location sub-system and to the second position location sub-system. The first position location sub-system includes a plurality of position location sub-system components 6604A, 6604B, and 6604C. The second position location sub-system includes a plurality of position location sub-system components 6606A, 6606B, and 6606C. The position location sub-system components couple to the processing circuitry of the game console 12 via wired and/or wireless communication links.
  • With the embodiment of FIG. 66, the first position location sub-system is operable to determine first position location information regarding a first gaming object 14 using a first position location technique. The second position location sub-system is operable to determine second position location information regarding a second object 52. The second position location sub-system uses a second position location technique that differs from the first position location technique. As was previously described with reference to FIGS. 1-65, various position location techniques may be employed in accordance with the present invention. Generally, the first position location sub-system, which includes components 6604A-6604C, uses a position location technique that differs from the second position location technique used by the second position location sub-system, which includes components 6606A-6606C.
  • The game console 12 includes processing circuitry coupled to both the first position location sub-system and the second position location sub-system via wired and/or wireless couplings. The processing circuitry of the game console 12 processes the first position location information to determine a position of the first object 14 within a coordinate system. Further, the processing circuitry processes the second position location information to determine a position of the second object 52 within the coordinate system. As was previously described herein with reference to FIGS. 1-64, the coordinate system is associated with the physical environment within which the position location system is deployed. When the position location system operates in conjunction with a video gaming system, the coordinate system in which objects 14 and 52 are located is related to a video gaming function. In such case, the locations of the gaming objects 14 and 52 are related to the gaming environment so that the positions of players 16 and 50, as well as the gaming functions and operations, are incorporated into the video game function. Thus, with the system of FIG. 66, the first position location sub-system is used to determine the position 18 and track the motion 20 of player 16 and/or gaming object 14. Further, the second position location sub-system is employed to determine the position 54 and/or track the motion 56 of player 50 and/or gaming object 52.
  • As was previously described herein, the coordinate system used with the system of FIG. 66 may be a three-dimensional Cartesian coordinate system or a spherical coordinate system. Further, as was the case with the system of FIG. 65, the position location sub-systems of FIG. 66 include receivers and/or transmitters deployed about a physical area that are operable to locate players 16 and 50 and/or gaming objects 14 and/or 52 within the physical area. Additional operations of the position location system of FIG. 66 will be described herein with reference to FIGS. 71-73. Generally, the operations described herein with reference to FIGS. 67-73 may be employed with one or both of the systems of FIGS. 65 and 66.
  • FIG. 67 is a flow chart illustrating operations of a position location system employing multiple position location techniques. With the operation of FIG. 67, the position location system first evaluates its physical environment (Step 670). In evaluating the physical environment at Step 670, the position location sub-system may perform calibration operations. Such calibration operations may be performed according to techniques previously described herein and that will be described herein with reference to FIG. 68.
  • Operation proceeds with capturing first position location information regarding the object using a first position location sub-system that uses a first position location technique (Step 672). The system of FIGS. 65 and/or 66 may be employed with the operations of FIG. 67 to locate the object using the first position location sub-system. Operation proceeds with capturing second position location information regarding the object by a second position location sub-system using a second position location technique (Step 674). Then, processing circuitry or another processing device processes the first position location information and the second position location information to determine a position of the object within a coordinate system (Step 676). The coordinate system may correspond to a gaming environment, a factory, an office, a shopping mall, or any other physical area within which objects may be located. Then, for gaming system embodiments, the position of the object within the coordinate system is integrated into a video game function (Step 678).
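  • One possible realization of Step 676, assuming each sub-system reports a position estimate together with an accuracy figure, is an inverse-variance weighted combination of the two estimates; the sub-system labels and accuracy numbers below are assumptions, not values from this disclosure.

```python
# Minimal sketch of Step 676 under the assumption that both sub-systems report
# a position with an accuracy estimate: combine them with inverse-variance
# weighting before handing the fused position to the game function.
import numpy as np

def fuse(pos_a: np.ndarray, var_a: float, pos_b: np.ndarray, var_b: float) -> np.ndarray:
    """Inverse-variance weighted combination of two independent position estimates."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    return (w_a * pos_a + w_b * pos_b) / (w_a + w_b)

rf_estimate = np.array([1.02, 2.10, 0.98])        # e.g., RF sub-system, ~5 cm std dev
acoustic_estimate = np.array([1.10, 2.00, 1.05])  # e.g., acoustic sub-system, ~10 cm std dev
fused = fuse(rf_estimate, 0.05**2, acoustic_estimate, 0.10**2)
print(fused)  # weighted toward the more accurate RF estimate
```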
  • According to various embodiments of Step 676, the first position location information and the second position location information are used in differing manners. For example, the first position location information may be used as primary information to locate the object while the second position location information may be used as secondary information to locate the object. With this operation, the second position location information serves as a safeguard or resolution enhancement, either to error-check the first position location information or to increase its resolution. Further, the second position location information may be used to calibrate the first position location information. Such calibration may occur at startup and/or at regular intervals during operation of the position location system.
  • In other embodiments, the second position location information is simply used to augment the first position location information. An example of such augmentation occurs when the first position location information is interrupted intermittently or infrequently. In such case, the second position location information fills in the missing first position location information. Further, augmentation of the first position location information with the second position location information may occur at points in the gaming operation when additional resolution, enhanced motion detection, greater positional accuracy, or another operation demands more than a single position location technique can provide.
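  • A hedged sketch of the primary/secondary and augmentation behavior described above follows: the first sub-system's reading is used when present, the second fills in dropouts, and the two are blended when extra resolution is requested. The function and parameter names are illustrative.

```python
# Illustrative primary/secondary handling: primary reading wins, secondary
# fills intermittent gaps, and a simple blend provides extra resolution on
# demand. Not the patent's specific logic.
from typing import Optional, Tuple

Position = Tuple[float, float, float]

def resolve_position(primary: Optional[Position],
                     secondary: Optional[Position],
                     need_extra_resolution: bool = False) -> Optional[Position]:
    if primary is None:
        return secondary                 # fill in an intermittent primary dropout
    if secondary is not None and need_extra_resolution:
        # simple augmentation: average the two readings
        return tuple((a + b) / 2.0 for a, b in zip(primary, secondary))
    return primary

print(resolve_position((1.0, 2.0, 1.0), (1.1, 2.1, 1.0)))  # primary wins
print(resolve_position(None, (1.1, 2.1, 1.0)))             # secondary fills in
```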
  • FIG. 68 is a flow chart illustrating usage of multiple position location techniques for locating an object. As shown in FIG. 68, operation begins with evaluating the physical environment (Step 680). The first position location sub-system captures the first position location information regarding the object using a first position location technique (Step 682). Then, the second position location sub-system captures the second position location information regarding the object using a second position location technique (Step 684). The processing circuitry then calibrates the first position location sub-system using the second position location information to produce calibration settings (Step 686). From Step 686, operation ends. Note that the operations of FIG. 68 may be employed at startup, periodically, or when a lack of acceptable calibration is detected by the position location system.
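  • The Step 686 calibration might, for example, be realized by averaging the disagreement between the two sub-systems while they observe the same object and applying that offset to later first-sub-system readings; the sketch below makes that assumption and uses placeholder sample values.

```python
# Sketch of one possible Step-686 calibration: while both sub-systems observe
# the same object, the average disagreement becomes a correction applied to
# later first-sub-system readings. Sample values are illustrative.
import numpy as np

def calibration_offset(first_samples: np.ndarray, second_samples: np.ndarray) -> np.ndarray:
    """Mean bias of the first sub-system relative to the second (both (N,3) arrays)."""
    return np.mean(second_samples - first_samples, axis=0)

first = np.array([[1.05, 2.02, 0.99], [1.06, 2.03, 1.01], [1.04, 2.01, 1.00]])
second = np.array([[1.00, 2.00, 1.00], [1.01, 2.00, 1.00], [0.99, 2.00, 1.00]])
offset = calibration_offset(first, second)
corrected = first + offset               # apply the calibration setting to new readings
print(offset, corrected[0])
```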
  • FIG. 69 is a flow chart illustrating usage of multiple position location techniques for determining position and motion of an object. The operations of FIG. 69 commence with the position location system evaluating the physical environment within which the position location system operates (Step 690). The first position location sub-system then captures the first position location information regarding the object using the first position location technique (Step 692). Operation proceeds with capturing second position location information regarding the object by the second position location sub-system using a second position location technique (Step 694). After the first and second position location information is captured, the processing circuitry determines the position of the object using the first position location information (Step 696). Then, the processing circuitry determines a motion of the object using the second position location information (Step 698).
  • One particular alternate embodiment of the operations of FIG. 69 uses one position location technique that is very good at determining the position of the object but not as good at determining motion. One example of such operation is using an acoustic wave detection technique to locate an object within a gaming environment while using multiple axis accelerometer sensing to determine motion of the object. Likewise, an RF signal detection technique could be used to locate the object while using the multiple axis accelerometer to detect motion of the object. In such case, a very high quality capture of both position and motion would result.
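  • As a sketch of this split (position from an RF or acoustic fix, motion from the accelerometer), the example below dead-reckons from a position fix by integrating hypothetical multiple axis accelerometer samples; the timing and sensor values are illustrative only.

```python
# Sketch of the FIG. 69 split: an RF (or acoustic) fix supplies the position,
# while multiple-axis accelerometer samples are integrated for short-term
# motion between fixes. Values are illustrative.
import numpy as np

def integrate_motion(position_fix: np.ndarray,
                     accel_samples: np.ndarray,
                     dt: float,
                     initial_velocity: np.ndarray) -> np.ndarray:
    """Dead-reckon from a position fix using accelerometer samples at interval dt."""
    pos, vel = position_fix.copy(), initial_velocity.copy()
    for a in accel_samples:
        vel = vel + a * dt
        pos = pos + vel * dt
    return pos

fix = np.array([1.0, 2.0, 1.0])                       # from the RF sub-system
accel = np.tile(np.array([0.5, 0.0, 0.0]), (10, 1))   # 10 samples of 0.5 m/s^2 along x
print(integrate_motion(fix, accel, dt=0.01, initial_velocity=np.zeros(3)))
```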
  • FIG. 70 is a flow chart illustrating operation for using multiple position location techniques to determine position and orientation of an object. The operations of FIG. 70 commence with the position location system evaluating the physical environment (Step 700). Then, the first position location sub-system captures first position location information regarding the object using the first position location technique (Step 702). The second position location sub-system then captures second position location information regarding the object using a second position location technique (Step 704). The processing circuitry then determines the position of a first reference point of the object using the first position location information (Step 706).
  • The processing circuitry next determines the position of a second reference point of the object using the second position location information (Step 708). Then, the processing circuitry determines a position of the object using the first and/or second position location information (Step 706). Finally, the position location system determines an orientation of the object using the first and/or second position location information (Step 708).
  • As was previously shown with reference to FIG. 4, the gaming object 14 may include multiple reference points and the player 16 may wear a plurality of sensing tags 44. Using the position location system illustrated in FIGS. 65 and/or 66, the various reference points, e.g., the sensing tags 44 worn by player 16 and/or the multiple reference points of gaming object 14, may be separately tracked using two different position location techniques. In such case, one reference point, e.g., a sensing tag 44 located on an arm or head of the player 16, may be used to locate the player, while information captured regarding differing sensing tags 44 of the player 16 may be used to determine an orientation of the player within the gaming environment. Likewise, when the gaming object 14 includes multiple reference points, e.g., multiple sensing tags 44, the first position location technique may be used to determine a position of one of the sensing tags 44 and the second position location technique may be used to determine the location of a second sensing tag on the gaming object 14. In combination, using the two position location techniques, both the position and orientation of gaming object 14 may be determined.
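  • For illustration, when two reference points on the same object are located by the two techniques, the vector between them yields an orientation (yaw and pitch); the tag coordinates in the sketch below are placeholders, not values from this disclosure.

```python
# Sketch of deriving orientation from two reference points on the same object
# (e.g., two sensing tags 44), each located by a different sub-system: the
# vector between the tags gives the object's pointing direction.
import math

def orientation_from_points(tag_a, tag_b):
    """Yaw and pitch (radians) of the vector from tag_a to tag_b."""
    dx, dy, dz = (b - a for a, b in zip(tag_a, tag_b))
    yaw = math.atan2(dy, dx)
    pitch = math.atan2(dz, math.hypot(dx, dy))
    return yaw, pitch

# Tag A located by the first technique, tag B by the second (placeholder values).
print(orientation_from_points((1.0, 2.0, 1.0), (1.3, 2.3, 1.2)))
```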
  • FIG. 71 is a flow chart illustrating operation for using multiple position location techniques to determine positions of multiple objects. The operation of FIG. 71 commences with the position location system evaluating a physical environment in which the position location system is deployed (Step 710). Operation continues with the first position location sub-system capturing first position location information regarding a first object using a first position location technique (Step 712). Operation continues with the second position location sub-system capturing second position location information regarding a second object using a second position location technique (Step 714). The processing circuitry of the position location system then processes the first position location information to determine a position of the first object within a coordinate system (Step 716). The coordinate system would have been established at Step 710 and may be included with a video game function as has been previously described in great detail with reference to the present invention.
  • Operation continues with the processing circuitry processing the second position location information to determine a position of a second object within the coordinate system (Step 718). A system in which multiple gaming object positions are tracked was previously described herein with reference to FIG. 66. Operation continues in FIG. 71 with the processing circuitry integrating the positions of the first and second objects within the coordinate system into a video game function (Step 719). The video game function operations will be employed when the position location sub-system operates in conjunction with the video game function. When the position location system is not used in conjunction with the video game function, Step 719 would not occur.
  • FIG. 72 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects. The operations of FIG. 72 commence with the position location system evaluating the physical environment (Step 720). In evaluating the physical environment at Step 720, the position location system may establish a coordinate system within the physical environment. Then, operation continues with the first position location sub-system capturing first position location information regarding a first object using a first position location technique (Step 722). Operation continues with the second position location sub-system capturing second position location information regarding a second object using a second position location technique (Step 723). Then, the processing circuitry or gaming console processes the first position location information to determine a position of the first object within the coordinate system (Step 724).
  • Operation continues with the position location system processing second position location information to determine a position of the second object within the coordinate system (Step 725). The processing circuitry then determines motion of the first object using the first position location information (Step 726). Finally, the processing circuitry determines a motion of the second object using the second position location information (Step 727). With the operations of FIG. 72, the first position location sub-system operates solely upon the first object while the second position location sub-system operates solely upon the second object. In such case, the first position location sub-system may locate multiple reference points on the object (or the player) for subsequent processing. Further, the second position location sub-system may locate multiple reference points on the second object (or player) for subsequent processing.
  • FIG. 73 is a flow chart illustrating operation for using multiple position location techniques to determine position and motion of multiple objects. Referring now to FIG. 73, the position location system evaluates the physical environment within which the position location system is deployed (Step 730). The first position location sub-system then captures first position location information regarding the first object using a first position location technique (Step 732). The second position location sub-system then captures the second position location information regarding a second object using a second position location technique (Step 733). The processing circuitry of the position location system then processes the first position location information to determine a position of the first object within the coordinate system (Step 734). The processing circuitry next processes the second position location information to determine a position of the second object within the coordinate system (Step 735).
  • Operation continues with the processing circuitry determining a motion of the second object using the first position location information (Step 736). Finally, operation concludes with the processing circuitry determining a motion of the first object using the second position location information (Step 737). In contrast to the operations of FIG. 72, the operations of FIG. 73 apply the position location techniques crosswise to common objects: a first position location technique is used to locate an object while a second position location technique is used to detect motion of that object. Thus, even though only two position location sub-systems are included with the position location system, cross-technique benefits are provided for tracking multiple gaming objects.
  • As may be used herein, the terms "substantially" and "approximately" provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences. As may also be used herein, the term(s) "coupled to" and/or "coupling" includes direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As may further be used herein, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as "coupled to." As may even further be used herein, the term "operable to" indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform one or more of its corresponding functions and may further include inferred coupling to one or more other items. As may still further be used herein, the term "associated with" includes direct and/or indirect coupling of separate items and/or one item being embedded within another item. As may be used herein, the term "compares favorably" indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1.
  • The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.
  • The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention. One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.

Claims (27)

1. A position location system comprising:
a first position location sub-system operable to determine first position location information regarding an object using a first position location technique;
a second position location sub-system operable to determine second position location information regarding the object, the second position location sub-system using a second position location technique that differs from the first position location technique; and
processing circuitry coupled to the first position location sub-system and to the second position location sub-system and operable to process the first position location information and the second position location information to determine position of the object within a coordinate system.
2. The position location system of claim 1, wherein:
the first position location sub-system includes a plurality of receivers for orientation about a physical area; and
the second position location sub-system includes a plurality of receivers for orientation about the physical area.
3. The position location system of claim 2, wherein the first position location sub-system includes at least one transmitter.
4. The position location system of claim 2, wherein the at least one transmitter and at least one receiver of the plurality of receivers of the first position location sub-system are substantially co-located.
5. The position location system of claim 1, wherein processing circuitry is operable to:
process the first position location information as primary information to locate the object; and
process the second position location information as secondary information to locate the object.
6. The position location system of claim 1, wherein the processing circuitry is operable to process the second position location information to calibrate the first position location information.
7. The position location system of claim 1, wherein the processing circuitry is operable to process the second position location information to augment the first position location information.
8. The position location system of claim 1, wherein the processing circuitry is operable to:
process the first position location information to determine a position of at least one first reference point on the object; and
process the second position location information to determine a position of at least one second reference point on the object.
9. The position location system of claim 8, wherein the processing circuitry is operable to determine a position and orientation of the object based upon the first position location information and the second position location information.
10. The position location system of claim 1, wherein the processing circuitry is operable to:
process the first position location information to determine the position of the object within the coordinate system; and
process the second position location information to determine motion of the object within the coordinate system.
11. The position location system of claim 1, wherein the first position location technique and the second position location technique are selected from the group consisting of:
acoustic wave detection, wherein the object includes at least one sound energy source and the first position location sub-system includes a plurality of sound energy receivers;
Radio Frequency (RF) signal detection, wherein the object includes at least one RF transmitter and the first position location sub-system includes a plurality of RF receivers;
RF signal detection, wherein the object includes at least one RF receiver and the first position location sub-system includes a plurality of RF transmitters;
RF signal detection, wherein the first position location sub-system includes at least one RF transmitter and a plurality of RF receivers;
digital imaging, wherein the object includes a plurality of digital cameras;
digital imaging, wherein the first position location sub-system includes a plurality of digital cameras;
Infrared (IR) detection wherein the object includes an IR source and the first position location sub-system includes a plurality of IR receivers;
IR detection, wherein the first position location sub-system includes at least one IR source and a plurality of IR receivers;
laser distance measurement;
thermal imaging; and
multiple axis accelerometer sensing.
12. A method for locating an object within a physical area comprising:
capturing first position location information regarding the object using a first position location sub-system using a first position location technique;
capturing second position location information regarding the object using a second position location sub-system using a second position location technique that differs from the first position location technique; and
processing the first position location information and the second position location information to determine a position of the object within a coordinate system.
13. The method of claim 12, further comprising:
processing the first position location information as primary information to locate the object; and
processing the second position location information as secondary information to locate the object.
14. The method of claim 12, further comprising processing the second position location information to calibrate the first position location information.
15. The method of claim 12, further comprising processing the second position location information to augment the first position location information.
16. The method of claim 12, further comprising:
processing the first position location information to determine a location of at least one first reference point on the object; and
processing the second position location information to determine a location of at least one second reference point on the object.
17. The method of claim 16, further comprising determining a position and orientation of the object based upon the first position location information and the second position location information.
18. The method of claim 12, further comprising:
processing the first position location information to determine the position of the object within the coordinate system; and
processing the second position location information to determine motion of the object within the coordinate system.
19. The method of claim 12, wherein the first position location technique and the second position location technique are selected from the group consisting of:
acoustic wave detection, wherein the object includes at least one sound energy source and the first position location sub-system includes a plurality of sound energy receivers;
Radio Frequency (RF) signal detection, wherein the object includes at least one RF transmitter and the first position location sub-system includes a plurality of RF receivers;
RF signal detection, wherein the object includes at least one RF receiver and the first position location sub-system includes a plurality of RF transmitters;
RF signal detection, wherein the first position location sub-system includes at least one RF transmitter and a plurality of RF receivers;
digital imaging, wherein the object includes a plurality of digital cameras;
digital imaging, wherein the first position location sub-system includes a plurality of digital cameras;
Infrared (IR) detection wherein the object includes an IR source and the first position location sub-system includes a plurality of IR receivers;
IR detection, wherein the first position location sub-system includes at least one IR source and a plurality of IR receivers;
laser distance measurement;
thermal imaging; and
multiple axis accelerometer sensing.
20. A video gaming system comprising:
a first position location sub-system to determine first position location information regarding a gaming object using a first position location technique;
a second position location sub-system to determine second position location information regarding the gaming object, the second position location sub-system using a second position location technique that differs from the first position location technique; and
a gaming console coupled to the first position location sub-system and to the second position location sub-system and operable to:
process the first position location information and the second position location information to determine a position of the object within a coordinate system; and
integrate the position of the object within the coordinate system within a video game function.
21. The video gaming system of claim 20, wherein the coordinate system comprises at least one of:
a three-dimensional Cartesian coordinate system; and
a spherical coordinate system.
22. The video gaming system of claim 20, wherein the gaming console is further operable to process the first position location information and the second position location information to determine motion of the object within the coordinate system.
23. The video gaming system of claim 20, wherein:
the first position location sub-system includes a plurality of receivers for orientation about a physical area;
the second position location sub-system includes a plurality of receivers for orientation about the physical area.
24. The video gaming system of claim 20, wherein the gaming console is operable to:
process the first position location information as primary information to locate the object; and
process the second position location information as secondary information to locate the object.
25. The video gaming system of claim 20, wherein the gaming console is operable to process the second position location information to calibrate the first position location information.
26. The video gaming system of claim 20, wherein the gaming console is operable to:
process the first position location information to determine the position of the object within the coordinate system; and
process the second position location information to determine motion of the object within the coordinate system.
27. The video gaming system of claim 20, wherein the first position location technique and the second position location technique are selected from the group consisting of:
acoustic wave detection, wherein the object includes at least one sound energy source and the first position location sub-system includes a plurality of sound energy receivers;
Radio Frequency (RF) signal detection, wherein the object includes at least one RF transmitter and the first position location sub-system includes a plurality of RF receivers;
RF signal detection, wherein the object includes at least one RF receiver and the first position location sub-system includes a plurality of RF transmitters;
RF signal detection, wherein the first position location sub-system includes at least one RF transmitter and a plurality of RF receivers;
digital imaging, wherein the object includes a plurality of digital cameras;
digital imaging, wherein the first position location sub-system includes a plurality of digital cameras;
Infrared (IR) detection wherein the object includes an IR source and the first position location sub-system includes a plurality of IR receivers;
IR detection, wherein the first position location sub-system includes at least one IR source and a plurality of IR receivers;
laser distance measurement;
thermal imaging; and
multiple axis accelerometer sensing.
US12/142,702 2007-06-22 2008-06-19 Position location system using multiple position location techniques Abandoned US20080318595A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/142,702 US20080318595A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US93672407P 2007-06-22 2007-06-22
US12/142,702 US20080318595A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques

Publications (1)

Publication Number Publication Date
US20080318595A1 true US20080318595A1 (en) 2008-12-25

Family

ID=40135930

Family Applications (26)

Application Number Title Priority Date Filing Date
US12/125,154 Abandoned US20090017910A1 (en) 2007-01-31 2008-05-22 Position and motion tracking of an object
US12/128,785 Expired - Fee Related US7973702B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple HCF transmissions
US12/128,810 Expired - Fee Related US8031121B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple antennas
US12/128,797 Abandoned US20080318689A1 (en) 2007-06-22 2008-05-29 Local positioning system and video game applications thereof
US12/131,331 Active 2034-07-30 US9523767B2 (en) 2007-06-22 2008-06-02 Game console and gaming object with motion prediction modeling and methods for use therewith
US12/131,605 Abandoned US20080318673A1 (en) 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith
US12/131,480 Abandoned US20080318680A1 (en) 2007-06-22 2008-06-02 Gaming object and gaming console that communicate user data via backscattering and methods for use therewith
US12/131,579 Active 2029-08-10 US8160640B2 (en) 2007-06-22 2008-06-02 Multi-mode mobile communication device with motion sensor and methods for use therewith
US12/131,522 Active 2032-10-21 US9547080B2 (en) 2007-06-22 2008-06-02 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US12/131,550 Abandoned US20080318625A1 (en) 2007-06-22 2008-06-02 Mobile communication device with gaming mode and methods for use therewith
US12/135,341 Active 2029-06-27 US7952962B2 (en) 2007-06-22 2008-06-09 Directional microphone or microphones for position determination
US12/135,332 Abandoned US20080316324A1 (en) 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing
US12/136,939 Abandoned US20080318682A1 (en) 2007-06-22 2008-06-11 Dual transceiver gaming console interface and methods for use therewith
US12/137,143 Active 2033-04-24 US9417320B2 (en) 2007-06-22 2008-06-11 Game device that generates a display with a simulated body image and methods for use therewith
US12/137,747 Active 2031-09-01 US8628417B2 (en) 2007-06-22 2008-06-12 Game device with wireless position measurement and methods for use therewith
US12/142,064 Abandoned US20080318683A1 (en) 2007-06-22 2008-06-19 RFID based positioning system
US12/142,032 Active 2030-09-22 US8062133B2 (en) 2007-06-22 2008-06-19 Positioning within a video gaming environment using RF signals
US12/142,733 Abandoned US20080318684A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US12/142,702 Abandoned US20080318595A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US13/223,121 Expired - Fee Related US8289212B2 (en) 2007-06-22 2011-08-31 Apparatus for position detection using multiple antennas
US13/361,333 Active US8311579B2 (en) 2007-06-22 2012-01-30 Multi-mode mobile communication device with motion sensor and methods for use therewith
US13/592,804 Abandoned US20120315991A1 (en) 2007-06-22 2012-08-23 Apparatus position detection using multiple antennas
US13/627,360 Active US8676257B2 (en) 2007-06-22 2012-09-26 Multi-mode mobile communication device with motion sensor and methods for use therewith
US15/346,418 Active 2028-07-07 US10549195B2 (en) 2007-06-22 2016-11-08 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US15/346,254 Active US9943760B2 (en) 2007-06-22 2016-11-08 Game console and gaming object with motion prediction modeling and methods for use therewith
US16/730,166 Active 2029-03-21 US11426660B2 (en) 2007-06-22 2019-12-30 Gaming object with orientation sensor for interacting with a display and methods for use therewith

Family Applications Before (18)

Application Number Title Priority Date Filing Date
US12/125,154 Abandoned US20090017910A1 (en) 2007-01-31 2008-05-22 Position and motion tracking of an object
US12/128,785 Expired - Fee Related US7973702B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple HCF transmissions
US12/128,810 Expired - Fee Related US8031121B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple antennas
US12/128,797 Abandoned US20080318689A1 (en) 2007-06-22 2008-05-29 Local positioning system and video game applications thereof
US12/131,331 Active 2034-07-30 US9523767B2 (en) 2007-06-22 2008-06-02 Game console and gaming object with motion prediction modeling and methods for use therewith
US12/131,605 Abandoned US20080318673A1 (en) 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith
US12/131,480 Abandoned US20080318680A1 (en) 2007-06-22 2008-06-02 Gaming object and gaming console that communicate user data via backscattering and methods for use therewith
US12/131,579 Active 2029-08-10 US8160640B2 (en) 2007-06-22 2008-06-02 Multi-mode mobile communication device with motion sensor and methods for use therewith
US12/131,522 Active 2032-10-21 US9547080B2 (en) 2007-06-22 2008-06-02 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US12/131,550 Abandoned US20080318625A1 (en) 2007-06-22 2008-06-02 Mobile communication device with gaming mode and methods for use therewith
US12/135,341 Active 2029-06-27 US7952962B2 (en) 2007-06-22 2008-06-09 Directional microphone or microphones for position determination
US12/135,332 Abandoned US20080316324A1 (en) 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing
US12/136,939 Abandoned US20080318682A1 (en) 2007-06-22 2008-06-11 Dual transceiver gaming console interface and methods for use therewith
US12/137,143 Active 2033-04-24 US9417320B2 (en) 2007-06-22 2008-06-11 Game device that generates a display with a simulated body image and methods for use therewith
US12/137,747 Active 2031-09-01 US8628417B2 (en) 2007-06-22 2008-06-12 Game device with wireless position measurement and methods for use therewith
US12/142,064 Abandoned US20080318683A1 (en) 2007-06-22 2008-06-19 RFID based positioning system
US12/142,032 Active 2030-09-22 US8062133B2 (en) 2007-06-22 2008-06-19 Positioning within a video gaming environment using RF signals
US12/142,733 Abandoned US20080318684A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques

Family Applications After (7)

Application Number Title Priority Date Filing Date
US13/223,121 Expired - Fee Related US8289212B2 (en) 2007-06-22 2011-08-31 Apparatus for position detection using multiple antennas
US13/361,333 Active US8311579B2 (en) 2007-06-22 2012-01-30 Multi-mode mobile communication device with motion sensor and methods for use therewith
US13/592,804 Abandoned US20120315991A1 (en) 2007-06-22 2012-08-23 Apparatus position detection using multiple antennas
US13/627,360 Active US8676257B2 (en) 2007-06-22 2012-09-26 Multi-mode mobile communication device with motion sensor and methods for use therewith
US15/346,418 Active 2028-07-07 US10549195B2 (en) 2007-06-22 2016-11-08 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US15/346,254 Active US9943760B2 (en) 2007-06-22 2016-11-08 Game console and gaming object with motion prediction modeling and methods for use therewith
US16/730,166 Active 2029-03-21 US11426660B2 (en) 2007-06-22 2019-12-30 Gaming object with orientation sensor for interacting with a display and methods for use therewith

Country Status (1)

Country Link
US (26) US20090017910A1 (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080311851A1 (en) * 2007-06-14 2008-12-18 Hansen Christopher J Method and system for 60 GHZ location determination and coordination of WLAN/WPAN/GPS multimode devices
US20080316863A1 (en) * 2007-06-22 2008-12-25 Broadcom Corporation Directional microphone or microphones for position determination
US20090273340A1 (en) * 2008-05-01 2009-11-05 Cory James Stephanson Self-calibrating magnetic field monitor
US20100109849A1 (en) * 2008-10-30 2010-05-06 Nec (China)Co., Ltd. Multi-objects positioning system and power-control based multiple access control method
US20130162813A1 (en) * 2011-12-22 2013-06-27 Cory J. Stephanson Sensor event assessor training and integration
US8874162B2 (en) 2011-12-23 2014-10-28 Microsoft Corporation Mobile device safe driving
US8902936B2 (en) 2011-12-22 2014-12-02 Cory J. Stephanson Sensor event assessor input/output controller
US20150172567A1 (en) * 2013-12-12 2015-06-18 Flir Systems Ab Orientation-adapted image remote inspection systems and methods
US9230076B2 (en) 2012-08-30 2016-01-05 Microsoft Technology Licensing, Llc Mobile device child share
US9325752B2 (en) 2011-12-23 2016-04-26 Microsoft Technology Licensing, Llc Private interaction hubs
US9363250B2 (en) 2011-12-23 2016-06-07 Microsoft Technology Licensing, Llc Hub coordination service
US9420432B2 (en) 2011-12-23 2016-08-16 Microsoft Technology Licensing, Llc Mobile devices control
US9467834B2 (en) 2011-12-23 2016-10-11 Microsoft Technology Licensing, Llc Mobile device emergency service
US20170017874A1 (en) * 2016-05-06 2017-01-19 Qualcomm Incorporated Radio frequency identification (rfid) reader with frequency adjustment of continuous radio frequency (rf) wave
US9665702B2 (en) 2011-12-23 2017-05-30 Microsoft Technology Licensing, Llc Restricted execution modes
US9820231B2 (en) 2013-06-14 2017-11-14 Microsoft Technology Licensing, Llc Coalescing geo-fence events
US9880604B2 (en) 2011-04-20 2018-01-30 Microsoft Technology Licensing, Llc Energy efficient location detection
US9998866B2 (en) 2013-06-14 2018-06-12 Microsoft Technology Licensing, Llc Detecting geo-fence events using varying confidence levels
US20180239419A1 (en) * 2017-02-21 2018-08-23 WiseJet, Inc. Wireless transceiver system using beam tracking
US20190121424A1 (en) * 2017-10-13 2019-04-25 Tactual Labs Co. Backscatter hover detection
US10434396B2 (en) * 2015-11-30 2019-10-08 James Shaunak Divine Protective headgear with display and methods for use therewith
US10613226B2 (en) * 2010-02-24 2020-04-07 Sportsmedia Technology Corporation Tracking system
US20210394068A1 (en) * 2020-06-23 2021-12-23 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions

Families Citing this family (506)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8062126B2 (en) * 2004-01-16 2011-11-22 Sony Computer Entertainment Inc. System and method for interfacing with a computer program
US8915859B1 (en) * 2004-09-28 2014-12-23 Impact Sports Technologies, Inc. Monitoring device, system and method for a multi-player interactive game
US8835616B2 (en) * 2004-12-07 2014-09-16 Lanthiopep B.V. Methods for the production and secretion of modified peptides
US9405372B2 (en) 2006-07-14 2016-08-02 Ailive, Inc. Self-contained inertial navigation system for interactive control using movable controllers
US7702608B1 (en) 2006-07-14 2010-04-20 Ailive, Inc. Generating motion recognizers for arbitrary motions for video games and tuning the motion recognizers to the end user
US7636645B1 (en) 2007-06-18 2009-12-22 Ailive Inc. Self-contained inertial navigation system for interactive control using movable controllers
KR101299682B1 (en) * 2006-10-16 2013-08-22 삼성전자주식회사 Universal input device
US8344949B2 (en) * 2008-03-31 2013-01-01 Golba Llc Wireless positioning approach using time-delay of signals with a known transmission pattern
US8250921B2 (en) 2007-07-06 2012-08-28 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US8462109B2 (en) 2007-01-05 2013-06-11 Invensense, Inc. Controlling and accessing content using motion processing on mobile devices
US8952832B2 (en) 2008-01-18 2015-02-10 Invensense, Inc. Interfacing application programs and motion sensors of a device
US7934423B2 (en) 2007-12-10 2011-05-03 Invensense, Inc. Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US8139945B1 (en) 2007-01-20 2012-03-20 Centrak, Inc. Methods and systems for synchronized infrared real time location
US7636697B1 (en) 2007-01-29 2009-12-22 Ailive Inc. Method and system for rapid evaluation of logical expressions
US20080200224A1 (en) 2007-02-20 2008-08-21 Gametank Inc. Instrument Game System and Method
US8907193B2 (en) 2007-02-20 2014-12-09 Ubisoft Entertainment Instrument game system and method
US8284822B2 (en) * 2007-02-27 2012-10-09 Broadcom Corporation Method and system for utilizing direct digital frequency synthesis to process signals in multi-band applications
US20080205545A1 (en) * 2007-02-28 2008-08-28 Ahmadreza Rofougaran Method and System for Using a Phase Locked Loop for Upconversion in a Wideband Crystalless Polar Transmitter
US7826550B2 (en) * 2007-02-28 2010-11-02 Broadcom Corp. Method and system for a high-precision frequency generator using a direct digital frequency synthesizer for transmitters and receivers
US20080205550A1 (en) * 2007-02-28 2008-08-28 Ahmadreza Rofougaran Method and System for Using a Phase Locked Loop for Upconversion in a Wideband Polar Transmitter
US8116387B2 (en) * 2007-03-01 2012-02-14 Broadcom Corporation Method and system for a digital polar transmitter
US20090233710A1 (en) * 2007-03-12 2009-09-17 Roberts Thomas J Feedback gaming peripheral
US7894830B2 (en) * 2007-04-28 2011-02-22 Broadcom Corporation Motion adaptive wireless local area network, wireless communications device and integrated circuits for use therewith
US8064923B2 (en) * 2007-04-28 2011-11-22 Broadcom Corporation Wireless communications device and integrated circuits with global positioning and method for use therewith
JP4438825B2 (en) * 2007-05-29 2010-03-24 ソニー株式会社 Arrival angle estimation system, communication apparatus, and communication system
US20080299906A1 (en) * 2007-06-04 2008-12-04 Topway Electrical Appliance Company Emulating playing apparatus of simulating games
US8238832B1 (en) * 2007-08-28 2012-08-07 Marvell International Ltd. Antenna optimum beam forming for multiple protocol coexistence on a wireless device
US20090076345A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Adherent Device with Multiple Physiological Sensors
WO2009036333A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Dynamic pairing of patients to data collection gateways
WO2009036256A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Injectable physiological monitoring system
EP2194864B1 (en) 2007-09-14 2018-08-29 Medtronic Monitoring, Inc. System and methods for wireless body fluid monitoring
US8591430B2 (en) 2007-09-14 2013-11-26 Corventis, Inc. Adherent device for respiratory monitoring
WO2009036306A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Adherent cardiac monitor with advanced sensing capabilities
US8897868B2 (en) 2007-09-14 2014-11-25 Medtronic, Inc. Medical device automatic start-up upon contact to patient tissue
US8882613B2 (en) * 2007-09-14 2014-11-11 Kitris Ag System for capturing tennis match data
WO2009042190A1 (en) * 2007-09-25 2009-04-02 Wms Gaming Inc. Accessing wagering game services by aiming handheld device at external device
KR101187909B1 (en) * 2007-10-04 2012-10-05 삼성테크윈 주식회사 Surveillance camera system
JP5116424B2 (en) * 2007-10-09 2013-01-09 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
US8206325B1 (en) 2007-10-12 2012-06-26 Biosensics, L.L.C. Ambulatory system for measuring and monitoring physical activity and risk of falling and for automatic fall detection
WO2009052032A1 (en) * 2007-10-19 2009-04-23 Sony Computer Entertainment America Inc. Scheme for providing audio effects for a musical instrument and for controlling images with same
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
JP5411425B2 (en) * 2007-12-25 2014-02-12 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
US9020780B2 (en) * 2007-12-31 2015-04-28 The Nielsen Company (Us), Llc Motion detector module
NZ586806A (en) * 2008-01-03 2013-03-28 Qualcomm Inc Ultrasonic digitizer and host
US9007178B2 (en) 2008-02-14 2015-04-14 Intermec Ip Corp. Utilization of motion and spatial identification in RFID systems
US8994504B1 (en) 2008-02-14 2015-03-31 Intermec Ip Corp. Utilization of motion and spatial identification in mobile RFID interrogator
US9047522B1 (en) * 2008-02-14 2015-06-02 Intermec Ip Corp. Utilization of motion and spatial identification in mobile RFID interrogator
WO2009114548A1 (en) 2008-03-12 2009-09-17 Corventis, Inc. Heart failure decompensation prediction based on cardiac rhythm
JP5039950B2 (en) * 2008-03-21 2012-10-03 インターナショナル・ビジネス・マシーンズ・コーポレーション Object movement control system, object movement control method, server, and computer program
US20090258703A1 (en) * 2008-04-09 2009-10-15 Aaron Philip Brunstetter Motion Assessment Using a Game Controller
US8412317B2 (en) 2008-04-18 2013-04-02 Corventis, Inc. Method and apparatus to measure bioelectric impedance of patient tissue
JP5115991B2 (en) * 2008-04-30 2013-01-09 独立行政法人産業技術総合研究所 Object state detection apparatus and method
KR101630864B1 (en) * 2008-05-09 2016-06-16 코닌클리케 필립스 엔.브이. Method and system for conveying an emotion
US8242888B2 (en) 2008-06-05 2012-08-14 Keystone Technology Solutions, Llc Systems and methods to determine motion parameters using RFID tags
US8830062B2 (en) * 2008-06-05 2014-09-09 Micron Technology, Inc. Systems and methods to use radar in RFID systems
US8461966B2 (en) 2008-06-05 2013-06-11 Micron Technology, Inc. Systems and methods to determine kinematical parameters using RFID tags
US9844730B1 (en) * 2008-06-16 2017-12-19 Disney Enterprises, Inc. Method and apparatus for an interactive dancing video game
US8483623B2 (en) * 2008-06-19 2013-07-09 Broadcom Corporation Method and system for frequency-shift based PCB-to-PCB communications
GB2461577A (en) 2008-07-04 2010-01-06 Bombardier Transp Gmbh System and method for transferring electric energy to a vehicle
GB2461578A (en) 2008-07-04 2010-01-06 Bombardier Transp Gmbh Transferring electric energy to a vehicle
US8655622B2 (en) * 2008-07-05 2014-02-18 Ailive, Inc. Method and apparatus for interpreting orientation invariant motion
US10679749B2 (en) * 2008-08-22 2020-06-09 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
GB2463692A (en) 2008-09-19 2010-03-24 Bombardier Transp Gmbh An arrangement for providing a vehicle with electric energy
GB2463693A (en) 2008-09-19 2010-03-24 Bombardier Transp Gmbh A system for transferring electric energy to a vehicle
US8724849B2 (en) * 2008-10-01 2014-05-13 Sony Corporation Information processing device, information processing method, program, and information storage medium
US8157609B2 (en) * 2008-10-18 2012-04-17 Mattel, Inc. Mind-control toys and methods of interaction therewith
US7855683B2 (en) * 2008-11-04 2010-12-21 At&T Intellectual Property I, L.P. Methods and apparatuses for GPS coordinates extrapolation when GPS signals are not available
US20100122278A1 (en) * 2008-11-13 2010-05-13 Alfred Xueliang Xin Method and an automated direction following system
US9120016B2 (en) 2008-11-21 2015-09-01 Ubisoft Entertainment Interactive guitar game designed for learning to play the guitar
EP2192696B1 (en) * 2008-11-28 2014-12-31 Sequans Communications Wireless communications method and system with spatial multiplexing using dually polarized antennas and corresponding receiver
US8588805B2 (en) * 2008-12-13 2013-11-19 Broadcom Corporation Receiver utilizing multiple radiation patterns to determine angular position
JP2010152493A (en) 2008-12-24 2010-07-08 Sony Corp Input device, control apparatus, and control method for the input device
US20100177076A1 (en) * 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Edge-lit electronic-ink display device for use in indoor and outdoor environments
US8457013B2 (en) 2009-01-13 2013-06-04 Metrologic Instruments, Inc. Wireless dual-function network device dynamically switching and reconfiguring from a wireless network router state of operation into a wireless network coordinator state of operation in a wireless communication network
US20100177749A1 (en) * 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Methods of and apparatus for programming and managing diverse network components, including electronic-ink based display devices, in a mesh-type wireless communication network
JP2012515899A (en) 2009-01-27 2012-07-12 エックスワイゼッド・インタラクティヴ・テクノロジーズ・インコーポレーテッド Method and apparatus for ranging detection, orientation determination, and/or positioning of a single device and/or multiple devices
US9652030B2 (en) * 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US20110293144A1 (en) * 2009-02-02 2011-12-01 Agency For Science, Technology And Research Method and System for Rendering an Entertainment Animation
US8254964B2 (en) * 2009-02-23 2012-08-28 Sony Ericsson Mobile Communications Ab Method and arrangement relating to location based services for a communication device
US8311506B2 (en) * 2009-02-26 2012-11-13 Broadcom Corporation RFID receiver front end with phase cancellation and methods for use therewith
KR100999711B1 (en) 2009-03-09 2010-12-08 광주과학기술원 Apparatus for real-time calibrating in the collaboration system and method using the same
JP5287385B2 (en) * 2009-03-13 2013-09-11 オムロン株式会社 Measuring device
US8725156B2 (en) * 2009-04-02 2014-05-13 Honeywell International Inc. Methods for supporting mobile nodes in industrial control and automation systems and other systems and related apparatus
JP2010245796A (en) 2009-04-06 2010-10-28 Sony Corp Video display and method, video display system, and program
US20120121128A1 (en) * 2009-04-20 2012-05-17 Bent 360: Medialab Inc. Object tracking system
US8953029B2 (en) * 2009-05-08 2015-02-10 Sony Computer Entertainment America Llc Portable device interaction via motion sensitive controller
US8417264B1 (en) * 2009-05-14 2013-04-09 Sprint Spectrum L.P. Method and apparatus for determining location of a mobile station based on locations of multiple nearby mobile stations
KR100979623B1 (en) * 2009-05-27 2010-09-01 서울대학교산학협력단 Positioning system and method based on radio communication apparatus comprising multiple antennas
US20100304931A1 (en) * 2009-05-27 2010-12-02 Stumpf John F Motion capture system
US9383823B2 (en) * 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
CN101898042B (en) * 2009-05-31 2012-07-18 鸿富锦精密工业(深圳)有限公司 Game controller and control method thereof
US20100332842A1 (en) * 2009-06-30 2010-12-30 Yahoo! Inc. Determining a mood of a user based on biometric characteristic(s) of the user in an online system
US11253781B2 (en) 2009-07-10 2022-02-22 Valve Corporation Player biofeedback for dynamically controlling a video game state
US9511289B2 (en) 2009-07-10 2016-12-06 Valve Corporation Player biofeedback for dynamically controlling a video game state
US8676659B1 (en) * 2009-07-23 2014-03-18 Bank Of America Corporation Methods and apparatuses for facilitating financial transactions using gamer tag information
US20110025464A1 (en) * 2009-07-30 2011-02-03 Awarepoint Corporation Antenna Diversity For Wireless Tracking System And Method
KR20110012584A (en) * 2009-07-31 2011-02-09 삼성전자주식회사 Apparatus and method for estimating position by ultrasonic signal
US20110028218A1 (en) * 2009-08-03 2011-02-03 Realta Entertainment Group Systems and Methods for Wireless Connectivity of a Musical Instrument
CN102022979A (en) * 2009-09-21 2011-04-20 鸿富锦精密工业(深圳)有限公司 Three-dimensional optical sensing system
US8581773B1 (en) * 2009-10-15 2013-11-12 The Boeing Company Dual frequency transmitter
US8790259B2 (en) 2009-10-22 2014-07-29 Corventis, Inc. Method and apparatus for remote detection and monitoring of functional chronotropic incompetence
JP2011090604A (en) * 2009-10-26 2011-05-06 Seiko Epson Corp Optical position detection apparatus and display device with position detection function
JP5493702B2 (en) * 2009-10-26 2014-05-14 セイコーエプソン株式会社 Projection display with position detection function
JP5326989B2 (en) * 2009-10-26 2013-10-30 セイコーエプソン株式会社 Optical position detection device and display device with position detection function
JP2011099994A (en) 2009-11-06 2011-05-19 Seiko Epson Corp Projection display device with position detecting function
US8535133B2 (en) * 2009-11-16 2013-09-17 Broadcom Corporation Video game with controller sensing player inappropriate activity
US8429269B2 (en) * 2009-12-09 2013-04-23 Sony Computer Entertainment Inc. Server-side rendering
US9451897B2 (en) 2009-12-14 2016-09-27 Medtronic Monitoring, Inc. Body adherent patch with electronics for physiologic monitoring
US20110148884A1 (en) * 2009-12-17 2011-06-23 Charles Timberlake Zeleny System and method for determining motion of a subject
US8497902B2 (en) * 2009-12-18 2013-07-30 Sony Computer Entertainment Inc. System for locating a display device using a camera on a portable device and a sensor on a gaming console and method thereof
CN102109594B (en) * 2009-12-28 2014-04-30 深圳富泰宏精密工业有限公司 System and method for sensing and notifying voice
US20110166940A1 (en) * 2010-01-05 2011-07-07 Searete Llc Micro-impulse radar detection of a human demographic and delivery of targeted media content
US20110166937A1 (en) * 2010-01-05 2011-07-07 Searete Llc Media output with micro-impulse radar feedback of physiological response
US9019149B2 (en) 2010-01-05 2015-04-28 The Invention Science Fund I, Llc Method and apparatus for measuring the motion of a person
US8884813B2 (en) * 2010-01-05 2014-11-11 The Invention Science Fund I, Llc Surveillance of stress conditions of persons using micro-impulse radar
US9069067B2 (en) * 2010-09-17 2015-06-30 The Invention Science Fund I, Llc Control of an electronic apparatus using micro-impulse radar
US9024814B2 (en) 2010-01-05 2015-05-05 The Invention Science Fund I, Llc Tracking identities of persons using micro-impulse radar
US8795082B2 (en) * 2010-01-25 2014-08-05 Rambus Inc. Directional beam steering system and method to detect location and motion
US9104238B2 (en) * 2010-02-12 2015-08-11 Broadcom Corporation Systems and methods for providing enhanced motion detection
US8494558B2 (en) 2010-02-23 2013-07-23 Telefonaktiebolaget L M Ericsson (Publ) Communication performance guidance in a user terminal
US8979665B1 (en) 2010-03-22 2015-03-17 Bijan Najafi Providing motion feedback based on user center of mass
US8523667B2 (en) * 2010-03-29 2013-09-03 Microsoft Corporation Parental control settings based on body dimensions
US8965498B2 (en) 2010-04-05 2015-02-24 Corventis, Inc. Method and apparatus for personalized physiologic parameters
EP2407971A4 (en) * 2010-04-09 2012-09-19 Shenzhen Netcom Elect Co Ltd Portable multimedia player
US20110275434A1 (en) * 2010-05-04 2011-11-10 Mediatek Inc. Methods for controlling a process of a game and electronic devices utilizing the same
JP5700758B2 (en) * 2010-05-19 2015-04-15 任天堂株式会社 GAME DEVICE, GAME PROGRAM, GAME SYSTEM, AND GAME PROCESSING METHOD
US8428394B2 (en) 2010-05-25 2013-04-23 Marcus KRIETER System and method for resolving spatial orientation using intelligent optical selectivity
US20110298887A1 (en) * 2010-06-02 2011-12-08 Maglaque Chad L Apparatus Using an Accelerometer to Capture Photographic Images
US10843078B2 (en) * 2010-06-07 2020-11-24 Affectiva, Inc. Affect usage within a gaming context
US8593331B2 (en) * 2010-06-16 2013-11-26 Qualcomm Incorporated RF ranging-assisted local motion sensing
US8537847B2 (en) * 2010-06-22 2013-09-17 Sony Corporation Digital clock with internet connectivity and multiple resting orientations
US9167975B1 (en) * 2010-07-28 2015-10-27 Impact Sports Technologies, Inc. Motion resistant device to monitor heart rate in ambulatory patients
US8174934B2 (en) * 2010-07-28 2012-05-08 Empire Technology Development Llc Sound direction detection
US20120212374A1 (en) * 2010-08-17 2012-08-23 Qualcomm Incorporated Method and apparatus for rf-based ranging with multiple antennas
FI122328B (en) * 2010-08-18 2011-12-15 Sauli Hepo-Oja Active localization system
US8613666B2 (en) * 2010-08-31 2013-12-24 Microsoft Corporation User selection and navigation based on looped motions
US20120063270A1 (en) * 2010-09-10 2012-03-15 Pawcatuck, Connecticut Methods and Apparatus for Event Detection and Localization Using a Plurality of Smartphones
US20120064841A1 (en) * 2010-09-10 2012-03-15 Husted Paul J Configuring antenna arrays of mobile wireless devices using motion sensors
US8391334B1 (en) * 2010-09-27 2013-03-05 L-3 Communications Corp Communications reliability in a hub-spoke communications system
US11175375B2 (en) 2010-11-12 2021-11-16 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US10416276B2 (en) 2010-11-12 2019-09-17 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
KR101339431B1 (en) * 2010-11-19 2013-12-09 도시바삼성스토리지테크놀러지코리아 주식회사 Game controller, game machine, and game system employing the game controller
JP5241807B2 (en) * 2010-12-02 2013-07-17 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US8319682B2 (en) * 2011-01-06 2012-11-27 The Boeing Company Method and apparatus for examining an object using electromagnetic millimeter-wave signal illumination
US8753275B2 (en) * 2011-01-13 2014-06-17 BioSensics LLC Intelligent device to monitor and remind patients with footwear, walking aids, braces, or orthotics
WO2012154262A2 (en) 2011-02-21 2012-11-15 TransRobotics, Inc. System and method for sensing distance and/or movement
EP2497546A3 (en) 2011-03-08 2012-10-03 Nintendo Co., Ltd. Information processing program, information processing system, and information processing method
JP5792971B2 (en) 2011-03-08 2015-10-14 任天堂株式会社 Information processing system, information processing program, and information processing method
EP2497547B1 (en) 2011-03-08 2018-06-27 Nintendo Co., Ltd. Information processing program, information processing apparatus, information processing system, and information processing method
EP2497543A3 (en) 2011-03-08 2012-10-03 Nintendo Co., Ltd. Information processing program, information processing system, and information processing method
US9561443B2 (en) 2011-03-08 2017-02-07 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method
US9539511B2 (en) 2011-03-08 2017-01-10 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for operating objects in a virtual world based on orientation data related to an orientation of a device
US9159293B2 (en) * 2011-03-16 2015-10-13 Kyocera Corporation Electronic device, control method, and storage medium storing control program
GB201105587D0 (en) * 2011-04-01 2011-05-18 Elliptic Laboratories As User interfaces for electronic devices
US9000973B2 (en) * 2011-04-29 2015-04-07 The Invention Science Fund I, Llc Personal electronic device with a micro-impulse radar
US8884809B2 (en) * 2011-04-29 2014-11-11 The Invention Science Fund I, Llc Personal electronic device providing enhanced user environmental awareness
US9103899B2 (en) 2011-04-29 2015-08-11 The Invention Science Fund I, Llc Adaptive control of a personal electronic device responsive to a micro-impulse radar
US9151834B2 (en) 2011-04-29 2015-10-06 The Invention Science Fund I, Llc Network and personal electronic devices operatively coupled to micro-impulse radars
WO2012154649A2 (en) * 2011-05-06 2012-11-15 Spencer, Robert B. Artificial touch device for electronic touch screens
JP5937792B2 (en) * 2011-06-03 2016-06-22 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP5869236B2 (en) 2011-06-03 2016-02-24 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US8890684B2 (en) * 2011-06-17 2014-11-18 Checkpoint Systems, Inc. Background object sensor
RU2455676C2 (en) * 2011-07-04 2012-07-10 Общество с ограниченной ответственностью "ТРИДИВИ" Method of controlling device using gestures and 3d sensor for realising said method
US8943396B2 (en) 2011-07-18 2015-01-27 At&T Intellectual Property I, Lp Method and apparatus for multi-experience adaptation of media content
US9084001B2 (en) 2011-07-18 2015-07-14 At&T Intellectual Property I, Lp Method and apparatus for multi-experience metadata translation of media content with metadata
KR101893601B1 (en) * 2011-07-22 2018-08-31 삼성전자 주식회사 Input apparatus of display apparatus, display system and control method thereof
US9316731B2 (en) * 2011-08-04 2016-04-19 Lattice Semiconductor Corporation Low-cost tracking system
US8942412B2 (en) 2011-08-11 2015-01-27 At&T Intellectual Property I, Lp Method and apparatus for controlling multi-experience translation of media content
US9237362B2 (en) 2011-08-11 2016-01-12 At&T Intellectual Property I, Lp Method and apparatus for multi-experience translation of media content with sensor sharing
US10585472B2 (en) 2011-08-12 2020-03-10 Sony Interactive Entertainment Inc. Wireless head mounted display with differential rendering and sound localization
US10209771B2 (en) * 2016-09-30 2019-02-19 Sony Interactive Entertainment Inc. Predictive RF beamforming for head mounted display
WO2013032222A1 (en) * 2011-08-29 2013-03-07 한국전자통신연구원 Method and system for communicating between devices
US20130050499A1 (en) * 2011-08-30 2013-02-28 Qualcomm Incorporated Indirect tracking
WO2013035096A2 (en) * 2011-09-07 2013-03-14 Umoove Limited System and method of tracking an object in an image captured by a moving device
KR101398709B1 (en) * 2011-09-09 2014-05-28 주식회사 팬택 Terminal apparatus and method for supporting multi interface using user motion
US20130095875A1 (en) * 2011-09-30 2013-04-18 Rami Reuven Antenna selection based on orientation, and related apparatuses, antenna units, methods, and distributed antenna systems
US9449342B2 (en) 2011-10-27 2016-09-20 Ebay Inc. System and method for visualization of items in an environment using augmented reality
WO2013067526A1 (en) 2011-11-04 2013-05-10 Remote TelePointer, LLC Method and system for user interface for interactive devices using a mobile device
WO2013112223A2 (en) * 2011-11-09 2013-08-01 Marquette Trishaun Detection of an asymmetric object
US9933509B2 (en) 2011-11-10 2018-04-03 Position Imaging, Inc. System for tracking an object using pulsed frequency hopping
US9945940B2 (en) 2011-11-10 2018-04-17 Position Imaging, Inc. Systems and methods of wireless position tracking
JP2013153405A (en) * 2011-12-28 2013-08-08 Panasonic Corp Av apparatus and initial setting method thereof
US9295908B2 (en) 2012-01-13 2016-03-29 Igt Canada Solutions Ulc Systems and methods for remote gaming using game recommender
US9129489B2 (en) * 2012-01-13 2015-09-08 Gtech Canada Ulc Remote gaming method where venue's system suggests different games to remote player using a mobile gaming device
US9084932B2 (en) 2012-01-13 2015-07-21 Gtech Canada Ulc Automated discovery of gaming preferences
US9558625B2 (en) 2012-01-13 2017-01-31 Igt Canada Solutions Ulc Systems and methods for recommending games to anonymous players using distributed storage
US9011240B2 (en) * 2012-01-13 2015-04-21 Spielo International Canada Ulc Remote gaming system allowing adjustment of original 3D images for a mobile gaming device
US9558620B2 (en) 2012-01-13 2017-01-31 Igt Canada Solutions Ulc Systems and methods for multi-player remote gaming
US9208641B2 (en) * 2012-01-13 2015-12-08 Igt Canada Solutions Ulc Remote gaming method allowing temporary inactivation without terminating playing session due to game inactivity
US9536378B2 (en) 2012-01-13 2017-01-03 Igt Canada Solutions Ulc Systems and methods for recommending games to registered players using distributed storage
US9269222B2 (en) * 2012-01-13 2016-02-23 Igt Canada Solutions Ulc Remote gaming system using separate terminal to set up remote play with a gaming terminal
US9159189B2 (en) * 2012-01-13 2015-10-13 Gtech Canada Ulc Mobile gaming device carrying out uninterrupted game despite communications link disruption
US9123200B2 (en) * 2012-01-13 2015-09-01 Gtech Canada Ulc Remote gaming using game recommender system and generic mobile gaming device
JP5847852B2 (en) * 2012-01-17 2016-01-27 株式会社ソニー・コンピュータエンタテインメント Server, information processing method, information processing program, and computer-readable recording medium storing information processing program
US9088309B2 (en) * 2012-02-17 2015-07-21 Sony Corporation Antenna tuning arrangement and method
CN104137632B (en) * 2012-02-29 2018-10-26 英特尔公司 Community-based correction of position deviation and trajectory detection
WO2013148986A1 (en) 2012-03-30 2013-10-03 Corning Cable Systems Llc Reducing location-dependent interference in distributed antenna systems operating in multiple-input, multiple-output (mimo) configuration, and related components, systems, and methods
US9857451B2 (en) 2012-04-13 2018-01-02 Qualcomm Incorporated Systems and methods for mapping a source location
US9326689B2 (en) 2012-05-08 2016-05-03 Siemens Medical Solutions Usa, Inc. Thermally tagged motion tracking for medical treatment
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
TW201349032A (en) * 2012-05-23 2013-12-01 Tritan Technology Inc An anti-optical-noise pointer positioning system
US20130321245A1 (en) * 2012-06-04 2013-12-05 Fluor Technologies Corporation Mobile device for monitoring and controlling facility systems
US9213092B2 (en) * 2012-06-12 2015-12-15 Tyco Fire & Security Gmbh Systems and methods for detecting a change in position of an object
US9782669B1 (en) 2012-06-14 2017-10-10 Position Imaging, Inc. RF tracking with active sensory feedback
US10269182B2 (en) 2012-06-14 2019-04-23 Position Imaging, Inc. RF tracking with active sensory feedback
US20140009384A1 (en) * 2012-07-04 2014-01-09 3Divi Methods and systems for determining location of handheld device within 3d environment
US20140028500A1 (en) * 2012-07-30 2014-01-30 Yu-Ming Liu Positioning System
WO2014020921A1 (en) * 2012-07-31 2014-02-06 独立行政法人科学技術振興機構 Device for estimating placement of physical objects
US9519344B1 (en) 2012-08-14 2016-12-13 Position Imaging, Inc. User input system for immersive interaction
US10180490B1 (en) 2012-08-24 2019-01-15 Position Imaging, Inc. Radio frequency communication system
US9454879B2 (en) 2012-09-18 2016-09-27 Igt Canada Solutions Ulc Enhancements to game components in gaming systems
US9754442B2 (en) 2012-09-18 2017-09-05 Igt Canada Solutions Ulc 3D enhanced gaming machine with foreground and background game surfaces
US20140080638A1 (en) * 2012-09-19 2014-03-20 Board Of Regents, The University Of Texas System Systems and methods for providing training and instruction to a football kicker
JP2016502694A (en) * 2012-10-04 2016-01-28 ディズニー エンタープライゼス インコーポレイテッド Interactive objects for immersive environments
US10206610B2 (en) 2012-10-05 2019-02-19 TransRobotics, Inc. Systems and methods for high resolution distance sensing and applications
US9002641B2 (en) 2012-10-05 2015-04-07 Hand Held Products, Inc. Navigation system configured to integrate motion sensing device inputs
US9405011B2 (en) 2012-10-05 2016-08-02 Hand Held Products, Inc. Navigation system configured to integrate motion sensing device inputs
US9477993B2 (en) * 2012-10-14 2016-10-25 Ari M Frank Training a predictor of emotional response based on explicit voting on content and eye tracking to verify attention
WO2014093961A1 (en) 2012-12-15 2014-06-19 Position Imaging, Inc Cycling reference multiplexing receiver system
DE102012224321B4 (en) * 2012-12-21 2022-12-15 Applejack 199 L.P. Measuring device for detecting a hitting movement of a racket, training device and method for training a hitting movement
CA2861244A1 (en) 2012-12-28 2014-06-28 Gtech Canada Ulc Imitating real-world physics in a 3d enhanced gaming machine
US9635605B2 (en) 2013-03-15 2017-04-25 Elwha Llc Protocols for facilitating broader access in wireless communications
US9713013B2 (en) 2013-03-15 2017-07-18 Elwha Llc Protocols for providing wireless communications connectivity maps
US9876762B2 (en) 2012-12-31 2018-01-23 Elwha Llc Cost-effective mobile connectivity protocols
US9451394B2 (en) 2012-12-31 2016-09-20 Elwha Llc Cost-effective mobile connectivity protocols
US9980114B2 (en) 2013-03-15 2018-05-22 Elwha Llc Systems and methods for communication management
US8965288B2 (en) 2012-12-31 2015-02-24 Elwha Llc Cost-effective mobile connectivity protocols
US9781664B2 (en) 2012-12-31 2017-10-03 Elwha Llc Cost-effective mobile connectivity protocols
US9832628B2 (en) 2012-12-31 2017-11-28 Elwha, Llc Cost-effective mobile connectivity protocols
US9119068B1 (en) * 2013-01-09 2015-08-25 Trend Micro Inc. Authentication using geographic location and physical gestures
US9482741B1 (en) 2013-01-18 2016-11-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10856108B2 (en) 2013-01-18 2020-12-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
JP2014153663A (en) * 2013-02-13 2014-08-25 Sony Corp Voice recognition device, voice recognition method and program
US9480911B2 (en) 2013-02-28 2016-11-01 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
WO2014139092A1 (en) * 2013-03-12 2014-09-18 Zheng Shi System and method for interactive board
JP6127602B2 (en) * 2013-03-13 2017-05-17 沖電気工業株式会社 State recognition device, state recognition method, and computer program
US9843917B2 (en) 2013-03-15 2017-12-12 Elwha, Llc Protocols for facilitating charge-authorized connectivity in wireless communications
US9693214B2 (en) 2013-03-15 2017-06-27 Elwha Llc Protocols for facilitating broader access in wireless communications
US9706382B2 (en) 2013-03-15 2017-07-11 Elwha Llc Protocols for allocating communication services cost in wireless communications
US9807582B2 (en) 2013-03-15 2017-10-31 Elwha Llc Protocols for facilitating broader access in wireless communications
US9596584B2 (en) 2013-03-15 2017-03-14 Elwha Llc Protocols for facilitating broader access in wireless communications by conditionally authorizing a charge to an account of a third party
US9706060B2 (en) 2013-03-15 2017-07-11 Elwha Llc Protocols for facilitating broader access in wireless communications
US9813887B2 (en) 2013-03-15 2017-11-07 Elwha Llc Protocols for facilitating broader access in wireless communications responsive to charge authorization statuses
US9866706B2 (en) 2013-03-15 2018-01-09 Elwha Llc Protocols for facilitating broader access in wireless communications
US9781554B2 (en) 2013-03-15 2017-10-03 Elwha Llc Protocols for facilitating third party authorization for a rooted communication device in wireless communications
ITMI20130495A1 (en) * 2013-03-29 2014-09-30 Atlas Copco Blm Srl ELECTRONIC CONTROL AND CONTROL DEVICE FOR SENSORS
US20140302919A1 (en) * 2013-04-05 2014-10-09 Mark J. Ladd Systems and methods for sensor-based mobile gaming
US9311789B1 (en) 2013-04-09 2016-04-12 BioSensics LLC Systems and methods for sensorimotor rehabilitation
US9945939B1 (en) 2013-05-06 2018-04-17 Lokdon Llc Method for determining a location of an emitter
FR3006477B1 (en) * 2013-05-29 2016-09-30 Blinksight DEVICE AND METHOD FOR DETECTING THE HANDLING OF AT LEAST ONE OBJECT
AU2014274970C1 (en) * 2013-06-04 2019-12-05 Isolynx, Llc Object tracking system optimization and tools
US9782670B2 (en) * 2014-04-25 2017-10-10 Ubisoft Entertainment Computer program, method, and system for enabling an interactive event among a plurality of persons
US9372103B2 (en) * 2013-07-12 2016-06-21 Facebook, Inc. Calibration of grab detection
US9891337B2 (en) * 2013-07-15 2018-02-13 SeeScan, Inc. Utility locator transmitter devices, systems, and methods with dockable apparatus
US9128552B2 (en) * 2013-07-17 2015-09-08 Lenovo (Singapore) Pte. Ltd. Organizing display data on a multiuser display
US9223340B2 (en) 2013-08-14 2015-12-29 Lenovo (Singapore) Pte. Ltd. Organizing display data on a multiuser display
US9669297B1 (en) * 2013-09-18 2017-06-06 Aftershock Services, Inc. Using biometrics to alter game content
US9805208B2 (en) 2013-09-30 2017-10-31 Elwha Llc Mobile device sharing facilitation methods and systems with recipient-dependent inclusion of a data selection
US9740875B2 (en) 2013-09-30 2017-08-22 Elwha Llc Mobile device sharing facilitation methods and systems featuring exclusive data presentation
US9813891B2 (en) 2013-09-30 2017-11-07 Elwha Llc Mobile device sharing facilitation methods and systems featuring a subset-specific source identification
US9826439B2 (en) 2013-09-30 2017-11-21 Elwha Llc Mobile device sharing facilitation methods and systems operable in network equipment
US9774728B2 (en) 2013-09-30 2017-09-26 Elwha Llc Mobile device sharing facilitation methods and systems in a context of plural communication records
US9838536B2 (en) 2013-09-30 2017-12-05 Elwha, Llc Mobile device sharing facilitation methods and systems
JP6796485B2 (en) * 2013-10-09 2020-12-09 マサチューセッツ インスティテュート オブ テクノロジー Motion tracking by wireless reflection of the body
US10063982B2 (en) * 2013-10-09 2018-08-28 Voyetra Turtle Beach, Inc. Method and system for a game headset with audio alerts based on audio track analysis
US9753131B2 (en) 2013-10-09 2017-09-05 Massachusetts Institute Of Technology Motion tracking via body radio reflections
US9616343B2 (en) * 2013-11-18 2017-04-11 Gaming Support B.V. Hybrid gaming platform
US10634761B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
CN106062580B (en) 2013-12-27 2019-11-05 麻省理工学院 Positioning utilizing non-simultaneous transmission and multipath transmission
US9933247B2 (en) 2014-01-13 2018-04-03 The Boeing Company Mandrel configuration monitoring system
US9497728B2 (en) 2014-01-17 2016-11-15 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10200819B2 (en) 2014-02-06 2019-02-05 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US9437002B2 (en) 2014-09-25 2016-09-06 Elwha Llc Systems and methods for a dual modality sensor system
US9739883B2 (en) 2014-05-16 2017-08-22 Elwha Llc Systems and methods for ultrasonic velocity and acceleration detection
US9618618B2 (en) 2014-03-10 2017-04-11 Elwha Llc Systems and methods for ultrasonic position and motion detection
US20150260823A1 (en) * 2014-03-11 2015-09-17 Crestron Electronics, Inc. Method of enclosing and powering a Bluetooth emitter
JP2015196091A (en) * 2014-04-02 2015-11-09 アップルジャック 199 エル.ピー. Sensor-based gaming system for avatar to represent player in virtual environment
US9995824B2 (en) * 2014-04-09 2018-06-12 Thomas Danaher Harvey Methods and system to assist search for lost and submerged objects
US10871566B2 (en) * 2014-04-09 2020-12-22 Thomas Danaher Harvey Methods and system to assist search and interception of lost objects
US9885774B2 (en) * 2014-04-18 2018-02-06 Massachusetts Institute Of Technology Indoor localization of a multi-antenna receiver
US10746852B2 (en) 2014-04-28 2020-08-18 Massachusetts Institute Of Technology Vital signs monitoring via radio reflections
US10436888B2 (en) * 2014-05-30 2019-10-08 Texas Tech University System Hybrid FMCW-interferometry radar for positioning and monitoring and methods of using same
US9824524B2 (en) 2014-05-30 2017-11-21 Igt Canada Solutions Ulc Three dimensional enhancements to game components in gaming systems
US10347073B2 (en) 2014-05-30 2019-07-09 Igt Canada Solutions Ulc Systems and methods for three dimensional games in gaming systems
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US20160014390A1 (en) * 2014-07-08 2016-01-14 Apple Inc. Electronic Devices With Connector Alignment Assistance
US10234952B2 (en) * 2014-07-18 2019-03-19 Maxim Integrated Products, Inc. Wearable device for using human body as input mechanism
US9525472B2 (en) 2014-07-30 2016-12-20 Corning Incorporated Reducing location-dependent destructive interference in distributed antenna systems (DASS) operating in multiple-input, multiple-output (MIMO) configuration, and related components, systems, and methods
US9811164B2 (en) * 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9588625B2 (en) 2014-08-15 2017-03-07 Google Inc. Interactive textiles
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US20160073087A1 (en) * 2014-09-10 2016-03-10 Lenovo (Singapore) Pte. Ltd. Augmenting a digital image with distance data derived based on acoustic range information
US9830549B2 (en) * 2014-09-22 2017-11-28 Cosmonet Co., Ltd Data carrier and data carrier system
US9993723B2 (en) * 2014-09-25 2018-06-12 Intel Corporation Techniques for low power monitoring of sports game play
US9600080B2 (en) * 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
CN107003381A (en) 2014-10-07 2017-08-01 Xyz 互动技术公司 For the apparatus and method for orienting and positioning
US9797979B2 (en) 2014-10-08 2017-10-24 Symbol Technologies, Llc System for and method of estimating bearings of radio frequency identification (RFID) tags that return RFID receive signals whose power is below a predetermined threshold
GB2531378B (en) * 2014-10-10 2019-05-08 Zwipe As Power harvesting
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
US9423669B2 (en) 2014-11-04 2016-08-23 Qualcomm Incorporated Method and apparatus for camera autofocus based on Wi-Fi ranging technique
US10869175B2 (en) * 2014-11-04 2020-12-15 Nathan Schumacher System and method for generating a three-dimensional model using flowable probes
US9715010B2 (en) * 2014-11-28 2017-07-25 Htc Corporation Apparatus and method for detection
US10609475B2 (en) 2014-12-05 2020-03-31 Stages Llc Active noise control and customized audio system
US11327711B2 (en) 2014-12-05 2022-05-10 Microsoft Technology Licensing, Llc External visual interactions for speech-based devices
US9729267B2 (en) 2014-12-11 2017-08-08 Corning Optical Communications Wireless Ltd Multiplexing two separate optical links with the same wavelength using asymmetric combining and splitting
EP3234752B1 (en) * 2014-12-19 2019-02-20 Abb Ab Automatic configuration system for an operator console
US10275801B2 (en) * 2014-12-19 2019-04-30 Ca, Inc. Adapting user terminal advertisements responsive to measured user behavior
CN104461009B (en) * 2014-12-22 2018-01-09 百度在线网络技术(北京)有限公司 Object measuring method and smart device
US10009715B2 (en) * 2015-01-06 2018-06-26 Microsoft Technology Licensing, Llc Geographic information for wireless networks
US10324474B2 (en) 2015-02-13 2019-06-18 Position Imaging, Inc. Spatial diversity for relative position tracking
US11132004B2 (en) 2015-02-13 2021-09-28 Position Imaging, Inc. Spatial diversity for relative position tracking
US10642560B2 (en) 2015-02-13 2020-05-05 Position Imaging, Inc. Accurate geographic tracking of mobile devices
EP3062142B1 (en) 2015-02-26 2018-10-03 Nokia Technologies OY Apparatus for a near-eye display
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US10148918B1 (en) 2015-04-06 2018-12-04 Position Imaging, Inc. Modular shelving systems for package tracking
US11416805B1 (en) 2015-04-06 2022-08-16 Position Imaging, Inc. Light-based guidance for package tracking systems
US10853757B1 (en) 2015-04-06 2020-12-01 Position Imaging, Inc. Video for real-time confirmation in package tracking systems
US11501244B1 (en) 2015-04-06 2022-11-15 Position Imaging, Inc. Package tracking systems and methods
US9848780B1 (en) 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
RU2583450C1 (en) * 2015-04-14 2016-05-10 Игорь Александрович Маренков Method of locating ground source of radio-frequency of satellite communication system
CN111880650A (en) 2015-04-30 2020-11-03 谷歌有限责任公司 Gesture recognition based on wide field radar
JP6427279B2 (en) 2015-04-30 2018-11-21 グーグル エルエルシー RF based fine motion tracking for gesture tracking and recognition
KR102011992B1 (en) 2015-04-30 2019-08-19 구글 엘엘씨 Type-Agnostic RF Signal Representations
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
US10251046B2 (en) * 2015-06-01 2019-04-02 Huawei Technologies Co., Ltd. System and method for efficient link discovery in wireless networks
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US9995823B2 (en) 2015-07-31 2018-06-12 Elwha Llc Systems and methods for utilizing compressed sensing in an entertainment system
US10444256B2 (en) * 2015-08-07 2019-10-15 Structural Health Data Systems Device and system for relative motion sensing
US10542222B2 (en) 2015-08-31 2020-01-21 Daniel Arnold Multiview body camera system with environmental sensors and alert features
WO2017040724A1 (en) * 2015-08-31 2017-03-09 Daniel Arnold Multiview body camera system with environmental sensors and alert features
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
JP2018504652A (en) * 2015-10-09 2018-02-15 SZ DJI Technology Co., Ltd. Prominent feature based mobile positioning
US9929794B2 (en) * 2015-10-15 2018-03-27 Honeywell International Inc. Long term evolution (LTE) air to ground communication enhancements associated with uplink synchronization
WO2017079484A1 (en) 2015-11-04 2017-05-11 Google Inc. Connectors for connecting electronics embedded in garments to external devices
CN108293267B (en) * 2015-11-30 2021-08-10 索尼移动通讯有限公司 Method and system for improving fairness among devices accessing radio bandwidth
EP3176766B1 (en) * 2015-12-03 2019-07-17 Sony Mobile Communications, Inc. Remote controlling a plurality of controllable devices
WO2017113054A1 (en) * 2015-12-28 2017-07-06 华为技术有限公司 Floor positioning method, device and system
US20170255254A1 (en) * 2016-03-02 2017-09-07 Htc Corporation Tracker device of virtual reality system
US10444323B2 (en) 2016-03-08 2019-10-15 Position Imaging, Inc. Expandable, decentralized position tracking systems and methods
US10362678B2 (en) 2016-04-18 2019-07-23 Skyworks Solutions, Inc. Crystal packaging with conductive pillars
US10062670B2 (en) 2016-04-18 2018-08-28 Skyworks Solutions, Inc. Radio frequency system-in-package with stacked clocking crystal
US10297576B2 (en) 2016-04-18 2019-05-21 Skyworks Solutions, Inc. Reduced form factor radio frequency system-in-package
US10269769B2 (en) 2016-04-18 2019-04-23 Skyworks Solutions, Inc. System in package with vertically arranged radio frequency componentry
WO2017192167A1 (en) 2016-05-03 2017-11-09 Google Llc Connecting an electronic component to an interactive textile
US10285456B2 (en) 2016-05-16 2019-05-14 Google Llc Interactive fabric
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
WO2017200279A1 (en) * 2016-05-17 2017-11-23 Samsung Electronics Co., Ltd. Method and apparatus for facilitating interaction with virtual reality equipment
US10388027B2 (en) * 2016-06-01 2019-08-20 Kyocera Corporation Detection method, display apparatus, and detection system
US11436553B2 (en) 2016-09-08 2022-09-06 Position Imaging, Inc. System and method of object tracking using weight confirmation
US10252812B2 (en) 2016-09-28 2019-04-09 General Electric Company System and method for controlling fuel flow to a gas turbine engine based on motion sensor data
US10445925B2 (en) * 2016-09-30 2019-10-15 Sony Interactive Entertainment Inc. Using a portable device and a head-mounted display to view a shared virtual reality space
US9980075B1 (en) 2016-11-18 2018-05-22 Stages Llc Audio source spatialization relative to orientation sensor and output
US10945080B2 (en) 2016-11-18 2021-03-09 Stages Llc Audio analysis and processing system
CN107066121A (en) * 2016-11-30 2017-08-18 黄文超 Magnetic suspension mouse with RFID inductor matrices
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
US10573291B2 (en) 2016-12-09 2020-02-25 The Research Foundation For The State University Of New York Acoustic metamaterial
US10455364B2 (en) 2016-12-12 2019-10-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634503B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634506B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10302020B2 (en) 2016-12-12 2019-05-28 General Electric Company System and method for controlling a fuel flow to a gas turbine engine
US9773330B1 (en) * 2016-12-29 2017-09-26 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10973439B2 (en) 2016-12-29 2021-04-13 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10352962B2 (en) * 2016-12-29 2019-07-16 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis and feedback
US10650552B2 (en) 2016-12-29 2020-05-12 Magic Leap, Inc. Systems and methods for augmented reality
TWI692935B (en) 2016-12-29 2020-05-01 美商天工方案公司 Front end systems and related devices, integrated circuits, modules, and methods
US11318350B2 (en) 2016-12-29 2022-05-03 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
EP4300160A2 (en) 2016-12-30 2024-01-03 Magic Leap, Inc. Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light
US11120392B2 (en) 2017-01-06 2021-09-14 Position Imaging, Inc. System and method of calibrating a directional light source relative to a camera's field of view
US10146300B2 (en) * 2017-01-25 2018-12-04 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Emitting a visual indicator from the position of an object in a simulated reality emulation
GB2562452B (en) * 2017-02-14 2020-11-04 Sony Interactive Entertainment Europe Ltd Sensing apparatus and method
CN107016347A (en) * 2017-03-09 2017-08-04 腾讯科技(深圳)有限公司 Body-sensing action identification method, device and system
US10515924B2 (en) 2017-03-10 2019-12-24 Skyworks Solutions, Inc. Radio frequency modules
US20190050060A1 (en) * 2017-03-10 2019-02-14 Awearable Apparel Inc. Methods, systems, and media for providing input based on accelerometer input
GB2578527B (en) * 2017-04-21 2021-04-28 Zenimax Media Inc Player input motion compensation by anticipating motion vectors
WO2018200541A1 (en) 2017-04-24 2018-11-01 Carnegie Mellon University Virtual sensor system
US10754005B2 (en) 2017-05-31 2020-08-25 Google Llc Radar modulation for radar sensing using a wireless communication chipset
US10782390B2 (en) 2017-05-31 2020-09-22 Google Llc Full-duplex operation for radar sensing using wireless communication chipset
US10644397B2 (en) * 2017-06-30 2020-05-05 Intel Corporation Methods, apparatus and systems for motion predictive beamforming
US10578870B2 (en) 2017-07-26 2020-03-03 Magic Leap, Inc. Exit pupil expander
US10726218B2 (en) 2017-07-27 2020-07-28 Symbol Technologies, Llc Method and apparatus for radio frequency identification (RFID) tag bearing estimation
US10989803B1 (en) 2017-08-21 2021-04-27 Massachusetts Institute Of Technology Security protocol for motion tracking systems
WO2019075436A1 (en) * 2017-10-13 2019-04-18 Tactual Labs Co. Minimal driving of transmitters to increase hover detection
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11141160B2 (en) 2017-10-30 2021-10-12 Cilag Gmbh International Clip applier comprising a motor controller
US10959744B2 (en) 2017-10-30 2021-03-30 Ethicon Llc Surgical dissectors and manufacturing techniques
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag Gmbh International Method for operating a powered articulating multi-clip applier
WO2019113570A1 (en) 2017-12-10 2019-06-13 Magic Leap, Inc. Anti-reflective coatings on optical waveguides
WO2019126331A1 (en) 2017-12-20 2019-06-27 Magic Leap, Inc. Insert for augmented reality viewing device
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11612408B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Determining tissue composition via an ultrasonic system
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11832899B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US20190201139A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Communication arrangements for robot-assisted surgical platforms
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
US20190201039A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Situational awareness of electrosurgical systems
US10595887B2 (en) 2017-12-28 2020-03-24 Ethicon Llc Systems for adjusting end effector parameters based on perioperative information
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US20190200981A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws
US10758310B2 (en) 2017-12-28 2020-09-01 Ethicon Llc Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11937769B2 (en) 2017-12-28 2024-03-26 Cilag Gmbh International Method of hub communication, processing, storage and display
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11678881B2 (en) * 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
GB201802850D0 (en) 2018-02-22 2018-04-11 Sintef Tto As Positioning sound sources
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11589915B2 (en) 2018-03-08 2023-02-28 Cilag Gmbh International In-the-jaw classifier based on a model
US11389188B2 (en) 2018-03-08 2022-07-19 Cilag Gmbh International Start temperature of blade
US10755676B2 (en) 2018-03-15 2020-08-25 Magic Leap, Inc. Image correction due to deformation of components of a viewing device
CN113870457A (en) * 2018-03-22 2021-12-31 创新先进技术有限公司 Timing system, method, device and equipment for competitive sports
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11589865B2 (en) 2018-03-28 2023-02-28 Cilag Gmbh International Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems
US10772511B2 (en) * 2018-05-16 2020-09-15 Qualcomm Incorporated Motion sensor using cross coupling
WO2019232282A1 (en) 2018-05-30 2019-12-05 Magic Leap, Inc. Compact variable focus configurations
EP3803450A4 (en) 2018-05-31 2021-08-18 Magic Leap, Inc. Radar head pose localization
CN112400157A (en) 2018-06-05 2021-02-23 奇跃公司 Homography transformation matrix based temperature calibration of viewing systems
US11092812B2 (en) 2018-06-08 2021-08-17 Magic Leap, Inc. Augmented reality viewer with automated surface selection placement and content orientation placement
WO2020010097A1 (en) 2018-07-02 2020-01-09 Magic Leap, Inc. Pixel intensity modulation using modifying gain values
US11856479B2 (en) 2018-07-03 2023-12-26 Magic Leap, Inc. Systems and methods for virtual and augmented reality along a route with markers
WO2020010226A1 (en) 2018-07-03 2020-01-09 Magic Leap, Inc. Systems and methods for virtual and augmented reality
CN108717180B (en) * 2018-07-05 2021-09-17 南京航空航天大学 Networking radar power distribution method based on Stackelberg game
EP3827224B1 (en) 2018-07-24 2023-09-06 Magic Leap, Inc. Temperature dependent calibration of movement detection devices
US11624929B2 (en) 2018-07-24 2023-04-11 Magic Leap, Inc. Viewing device with dust seal integration
CN112740665A (en) 2018-08-02 2021-04-30 奇跃公司 Observation system for interpupillary distance compensation based on head movement
EP3830631A4 (en) 2018-08-03 2021-10-27 Magic Leap, Inc. Unfused pose-based drift correction of a fused pose of a totem in a user interaction system
US11003205B2 (en) 2019-02-04 2021-05-11 Sigmasense, Llc. Receive analog to digital circuit of a low voltage drive circuit data communication system
US10499363B1 (en) * 2018-09-18 2019-12-03 Qualcomm Incorporated Methods and apparatus for improved accuracy and positioning estimates
US11361536B2 (en) 2018-09-21 2022-06-14 Position Imaging, Inc. Machine-learning-assisted self-improving object-identification system and method
JP7201379B2 (en) * 2018-10-02 2023-01-10 東芝テック株式会社 RFID tag reader
US11580316B2 (en) * 2018-11-08 2023-02-14 Avery Dennison Retail Information Services Llc Interacting RFID tags
JP2022509770A (en) 2018-11-16 2022-01-24 マジック リープ, インコーポレイテッド Clarification triggered by image size to maintain image sharpness
US20200168045A1 (en) 2018-11-28 2020-05-28 Igt Dynamic game flow modification in electronic wagering games
CN109557512B (en) * 2018-12-06 2020-08-04 航天南湖电子信息技术股份有限公司 Radar receiver with high sensitivity and high dynamic range
EP3668197B1 (en) * 2018-12-12 2021-11-03 Rohde & Schwarz GmbH & Co. KG Method and radio for setting the transmission power of a radio transmission
WO2020146861A1 (en) 2019-01-11 2020-07-16 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
US11425189B2 (en) 2019-02-06 2022-08-23 Magic Leap, Inc. Target intent-based clock speed determination and adjustment to limit total heat generated by multiple processors
US10977808B2 (en) * 2019-02-18 2021-04-13 Raytheon Company Three-frame difference target acquisition and tracking using overlapping target images
US11751872B2 (en) 2019-02-19 2023-09-12 Cilag Gmbh International Insertable deactivator element for surgical stapler lockouts
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11331100B2 (en) 2019-02-19 2022-05-17 Cilag Gmbh International Staple cartridge retainer system with authentication keys
WO2020185405A1 (en) 2019-03-12 2020-09-17 Magic Leap, Inc. Registration of local content between first and second augmented reality viewers
US11703593B2 (en) 2019-04-04 2023-07-18 TransRobotics, Inc. Technologies for acting based on object tracking
EP3963565A4 (en) * 2019-05-01 2022-10-12 Magic Leap, Inc. Content provisioning system and method
EP3928181A1 (en) 2019-06-17 2021-12-29 Google LLC Mobile device-based radar system for applying different power modes to a multi-mode interface
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
CN114174895A (en) 2019-07-26 2022-03-11 奇跃公司 System and method for augmented reality
GB2586059B (en) * 2019-08-01 2023-06-07 Sony Interactive Entertainment Inc System and method for generating user inputs for a video game
US10973062B2 (en) * 2019-08-26 2021-04-06 International Business Machines Corporation Method for extracting environment information leveraging directional communication
KR20210034270A (en) 2019-09-20 2021-03-30 삼성전자주식회사 Electronic device for determinig path of line of sight(los) and method for the same
US11737832B2 (en) 2019-11-15 2023-08-29 Magic Leap, Inc. Viewing system for use in a surgical environment
KR20210069479A (en) 2019-12-03 2021-06-11 삼성전자주식회사 Electronic device and operating method for identifying location information of device
US11860439B1 (en) 2020-05-06 2024-01-02 Apple Inc. Head-mounted electronic device with alignment sensors
CN112418200B (en) * 2021-01-25 2021-04-02 成都点泽智能科技有限公司 Object detection method and device based on thermal imaging and server
CN113129328B (en) * 2021-04-22 2022-05-17 中国电子科技集团公司第二十九研究所 Target hotspot area fine analysis method
US11640725B2 (en) 2021-05-28 2023-05-02 Sportsbox.ai Inc. Quantitative, biomechanical-based analysis with outcomes and context
GB2608186A (en) * 2021-06-25 2022-12-28 Thermoteknix Systems Ltd Augmented reality system
US11920521B2 (en) 2022-02-07 2024-03-05 General Electric Company Turboshaft load control using feedforward and feedback control

Family Cites Families (238)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2397746A (en) * 1942-05-23 1946-04-02 Hazeltine Corp Wave-signal direction finder
US3430243A (en) * 1966-04-04 1969-02-25 Robley D Evans Method of and apparatus for determining the distance and/or angles between objects with the aid of radiant energy
US3816830A (en) * 1970-11-27 1974-06-11 Hazeltine Corp Cylindrical array antenna
US3789410A (en) * 1972-01-07 1974-01-29 Us Navy Passive ranging technique
US4041494A (en) * 1975-11-10 1977-08-09 The United States Of America As Represented By The Secretary Of The Department Of Transportation Distance measuring method and apparatus
US4309703A (en) * 1979-12-28 1982-01-05 International Business Machines Corporation Segmented chirp waveform implemented radar system
US5248884A (en) * 1983-10-11 1993-09-28 The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland Infrared detectors
US4639900A (en) * 1984-02-22 1987-01-27 U.S. Philips Corporation Method and a system for monitoring a sea area
US4807183A (en) * 1985-09-27 1989-02-21 Carnegie-Mellon University Programmable interconnection chip for computer system functional modules
AU8237987A (en) * 1986-11-27 1988-06-16 Starpeak Computers Limited Locating system
US5027433A (en) * 1988-04-04 1991-06-25 Hm Electronics, Inc. Remote infrared transceiver and method of using same
US5214615A (en) * 1990-02-26 1993-05-25 Will Bauer Three-dimensional displacement of a body with computer interface
US5229764A (en) * 1991-06-20 1993-07-20 Matchett Noel D Continuous biometric authentication matrix
US5138322A (en) * 1991-08-20 1992-08-11 Matrix Engineering, Inc. Method and apparatus for radar measurement of ball in play
US7098891B1 (en) * 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US5502683A (en) * 1993-04-20 1996-03-26 International Business Machines Corporation Dual ported memory with word line access control
WO1994026075A1 (en) * 1993-05-03 1994-11-10 The University Of British Columbia Tracking platform system
JP2552427B2 (en) * 1993-12-28 1996-11-13 コナミ株式会社 Tv play system
US5574479A (en) * 1994-01-07 1996-11-12 Selectech, Ltd. Optical system for determining the roll orientation of a remote unit relative to a base unit
CA2141144A1 (en) * 1994-03-31 1995-10-01 Joseph Desimone Electronic game utilizing bio-signals
US5412619A (en) * 1994-04-14 1995-05-02 Bauer; Will Three-dimensional displacement of a body with computer interface
US6539336B1 (en) * 1996-12-12 2003-03-25 Phatrat Technologies, Inc. Sport monitoring system for determining airtime, speed, power absorbed and other factors such as drop distance
US8280682B2 (en) * 2000-12-15 2012-10-02 Tvipr, Llc Device for monitoring movement of shipped goods
US5943427A (en) * 1995-04-21 1999-08-24 Creative Technology Ltd. Method and apparatus for three dimensional audio spatialization
US6418324B1 (en) * 1995-06-01 2002-07-09 Padcom, Incorporated Apparatus and method for transparent wireless communication between a remote device and host system
US5528557A (en) * 1995-08-07 1996-06-18 Northrop Grumman Corporation Acoustic emission source location by reverse ray tracing
US5742840A (en) * 1995-08-16 1998-04-21 Microunity Systems Engineering, Inc. General purpose, multiple precision parallel operation, programmable media processor
US6087979A (en) * 1995-09-07 2000-07-11 Siemens Aktiengesellschaft Rangefinder
US6278418B1 (en) * 1995-12-29 2001-08-21 Kabushiki Kaisha Sega Enterprises Three-dimensional imaging system, game device, method for same and recording medium
US5754948A (en) * 1995-12-29 1998-05-19 University Of North Carolina At Charlotte Millimeter-wave wireless interconnection of electronic components
US6396041B1 (en) * 1998-08-21 2002-05-28 Curtis A. Vock Teaching and gaming golf feedback system and methods
US5700204A (en) * 1996-06-17 1997-12-23 Teder; Rein S. Projectile motion parameter determination device using successive approximation and high measurement angle speed sensor
US6400374B2 (en) * 1996-09-18 2002-06-04 Eyematic Interfaces, Inc. Video superposition system and method
US5786912A (en) * 1996-12-27 1998-07-28 Lucent Technologies Inc. Waveguide-based, fabricless switch for telecommunication system and telecommunication infrastructure employing the same
US6182203B1 (en) * 1997-01-24 2001-01-30 Texas Instruments Incorporated Microprocessor
US6814293B2 (en) * 1997-02-10 2004-11-09 Symbol Technologies, Inc. Arrangement for and method of establishing a logical relationship among peripherals in a wireless local area network
JP4121560B2 (en) * 1997-02-13 2008-07-23 ノキア コーポレイション Directional wireless communication method and apparatus
US6070269A (en) * 1997-07-25 2000-06-06 Medialab Services S.A. Data-suit for real-time computer animation and virtual reality applications
US20020036617A1 (en) * 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
US6142876A (en) * 1997-08-22 2000-11-07 Cumbers; Blake Player tracking and identification system
US6162123A (en) * 1997-11-25 2000-12-19 Woolston; Thomas G. Interactive electronic sword game
US5884104A (en) * 1997-11-26 1999-03-16 Eastman Kodak Company Compact camera flash unit
US6421453B1 (en) * 1998-05-15 2002-07-16 International Business Machines Corporation Apparatus and methods for user recognition employing behavioral passwords
US6438622B1 (en) * 1998-11-17 2002-08-20 Intel Corporation Multiprocessor system including a docking system
FR2786899B1 (en) * 1998-12-03 2006-09-29 Jean Bonnard MOVEMENT INDICATOR FOR SOFTWARE
US7102616B1 (en) * 1999-03-05 2006-09-05 Microsoft Corporation Remote control device with pointing capacity
US7933295B2 (en) * 1999-04-13 2011-04-26 Broadcom Corporation Cable modem with voice processing capability
US7015950B1 (en) * 1999-05-11 2006-03-21 Pryor Timothy R Picture taking method and apparatus
US6343315B1 (en) * 1999-05-12 2002-01-29 Lodgenet Entertainment Corporation Entertainment/Information system having disparate interactive devices
US6653971B1 (en) * 1999-05-14 2003-11-25 David L. Guice Airborne biota monitoring and control system
US6500070B1 (en) * 1999-05-28 2002-12-31 Nintendo Co., Ltd. Combined game system of portable and video game machines
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US6177903B1 (en) * 1999-06-14 2001-01-23 Time Domain Corporation System and method for intrusion detection using a time domain radar array
US7592944B2 (en) * 1999-06-14 2009-09-22 Time Domain Corporation System and method for intrusion detection using a time domain radar array
JP4278071B2 (en) * 1999-06-17 2009-06-10 株式会社バンダイナムコゲームス Image generation system and information storage medium
US6545661B1 (en) * 1999-06-21 2003-04-08 Midway Amusement Games, Llc Video game system having a control unit with an accelerometer for controlling a video game
JP2001104636A (en) * 1999-10-04 2001-04-17 Shinsedai Kk Cenesthesic ball game device
US7030860B1 (en) * 1999-10-08 2006-04-18 Synaptics Incorporated Flexible transparent touch sensing system for electronic devices
US6735708B2 (en) * 1999-10-08 2004-05-11 Dell Usa, L.P. Apparatus and method for a combination personal digital assistant and network portable device
US8956228B2 (en) * 1999-12-03 2015-02-17 Nike, Inc. Game pod
US7010634B2 (en) * 1999-12-23 2006-03-07 Intel Corporation Notebook computer with independently functional, dockable core computer
US7445550B2 (en) * 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US6315667B1 (en) * 2000-03-28 2001-11-13 Robert Steinhart System for remote control of a model airplane
JP4020567B2 (en) * 2000-05-15 2007-12-12 株式会社コナミデジタルエンタテインメント Game machine and game environment setting network system thereof
US20020049806A1 (en) * 2000-05-16 2002-04-25 Scott Gatz Parental control system for use in connection with account-based internet access server
EP1168158B1 (en) * 2000-06-12 2007-10-10 Broadcom Corporation Context switch architecture and system
JP2002052243A (en) * 2000-08-11 2002-02-19 Konami Co Ltd Competition type video game
US7918808B2 (en) * 2000-09-20 2011-04-05 Simmons John C Assistive clothing
KR100364368B1 (en) * 2000-10-18 2002-12-12 엘지전자 주식회사 Private Network Using Bluetooth and Communication Method Using the Network
JP2002171245A (en) * 2000-12-05 2002-06-14 Sony Corp Method for synthesizing retransmitted data and device for synthesizing retransmitted data
US6735663B2 (en) * 2000-12-18 2004-05-11 Dell Products L.P. Combination personal data assistant and personal computing device
EP1216899A1 (en) * 2000-12-22 2002-06-26 Ford Global Technologies, Inc. Communication system for use with a vehicle
JP2002199500A (en) * 2000-12-25 2002-07-12 Sony Corp Virtual sound image localizing processor, virtual sound image localization processing method and recording medium
JP3722279B2 (en) * 2001-01-26 2005-11-30 日本電気株式会社 Optical transceiver module
US6816925B2 (en) * 2001-01-26 2004-11-09 Dell Products L.P. Combination personal data assistant and personal computing device with master slave input output
US7197584B2 (en) * 2001-01-26 2007-03-27 Dell Products L.P. Removable personal digital assistant in a dual personal computer/personal digital assistant computer architecture
US6801974B1 (en) * 2001-01-26 2004-10-05 Dell Products L.P. Method of filtering events in a combinational computing device
EP1274279B1 (en) * 2001-02-14 2014-06-18 Sony Corporation Sound image localization signal processor
US7131907B2 (en) * 2001-02-22 2006-11-07 Kabushiki Kaisha Sega System and method for superimposing an image on another image in a video game
US20020183038A1 (en) * 2001-05-31 2002-12-05 Palm, Inc. System and method for crediting an account associated with a network access node
US7082285B2 (en) * 2001-03-23 2006-07-25 Broadcom Corporation Reduced instruction set baseband controller
US6540607B2 (en) * 2001-04-26 2003-04-01 Midway Games West Video game position and orientation detection system
US7065326B2 (en) * 2001-05-02 2006-06-20 Trex Enterprises Corporation Millimeter wave communications system with a high performance modulator circuit
US6587699B2 (en) * 2001-05-02 2003-07-01 Trex Enterprises Corporation Narrow beamwidth communication link with alignment camera
US20040174431A1 (en) * 2001-05-14 2004-09-09 Stienstra Marcelle Andrea Device for interacting with real-time streams of content
US6563940B2 (en) * 2001-05-16 2003-05-13 New Jersey Institute Of Technology Unauthorized user prevention device and method
SE523407C2 (en) * 2001-05-18 2004-04-13 Jan G Faeger Device for determining the position and / or orientation of a creature in relation to an environment and use of such a device
US20030172380A1 (en) * 2001-06-05 2003-09-11 Dan Kikinis Audio command and response for IPGs
US20030001882A1 (en) * 2001-06-29 2003-01-02 Macer Peter J. Portable entertainment machines
DE10136981A1 (en) * 2001-07-30 2003-02-27 Daimler Chrysler Ag Method and device for determining a stationary and / or moving object
US7094164B2 (en) * 2001-09-12 2006-08-22 Pillar Vision Corporation Trajectory detection and feedback system
US6760387B2 (en) * 2001-09-21 2004-07-06 Time Domain Corp. Impulse radio receiver and method for finding angular offset of an impulse radio transmitter
US7054423B2 (en) * 2001-09-24 2006-05-30 Nebiker Robert M Multi-media communication downloading
US6937182B2 (en) * 2001-09-28 2005-08-30 Trex Enterprises Corp. Millimeter wave imaging system
US7257093B1 (en) * 2001-10-10 2007-08-14 Sandia Corporation Localized radio frequency communication using asynchronous transfer mode protocol
US6987988B2 (en) * 2001-10-22 2006-01-17 Waxess, Inc. Cordless and wireless telephone docking station with land line interface and switching mode
US7444393B2 (en) * 2001-10-30 2008-10-28 Keicy K. Chung Read-only storage device having network interface, a system including the device, and a method of distributing files over a network
US20050282633A1 (en) * 2001-11-13 2005-12-22 Frederic Nicolas Movement-sensing apparatus for software
US20030112585A1 (en) * 2001-12-13 2003-06-19 Silvester Kelan Craig Multiprocessor notebook computer with a tablet PC conversion capability
US6712692B2 (en) * 2002-01-03 2004-03-30 International Business Machines Corporation Using existing videogames for physical training and rehabilitation
JP3914771B2 (en) * 2002-01-09 2007-05-16 株式会社日立製作所 Packet communication apparatus and packet data transfer control method
GB0203621D0 (en) * 2002-02-15 2002-04-03 Bae Systems Defence Systems L Emitter location system
WO2003071813A2 (en) * 2002-02-19 2003-08-28 Zyray Wireless, Inc. Method and apparatus optimizing a radio link
US6990320B2 (en) * 2002-02-26 2006-01-24 Motorola, Inc. Dynamic reallocation of processing resources for redundant functionality
KR100449102B1 (en) * 2002-03-19 2004-09-18 삼성전자주식회사 System on chip processor for multimedia
US20030195040A1 (en) * 2002-04-10 2003-10-16 Breving Joel S. Video game system and game controller
US20030211888A1 (en) * 2002-05-13 2003-11-13 Interactive Telegames, Llc Method and apparatus using insertably-removable auxiliary devices to play games over a communications link
US7085536B2 (en) * 2002-05-23 2006-08-01 Intel Corporation Method and apparatus for dynamically resolving radio frequency interference problems in a system
US7043588B2 (en) * 2002-05-24 2006-05-09 Dell Products L.P. Information handling system featuring multi-processor capability with processor located in docking station
US20050035955A1 (en) * 2002-06-06 2005-02-17 Carter Dale J. Method of determining orientation and manner of holding a mobile telephone
US7146014B2 (en) * 2002-06-11 2006-12-05 Intel Corporation MEMS directional sensor system
US7159099B2 (en) * 2002-06-28 2007-01-02 Motorola, Inc. Streaming vector processor with reconfigurable interconnection switch
US7161579B2 (en) * 2002-07-18 2007-01-09 Sony Computer Entertainment Inc. Hand-held computer interactive device
US7646372B2 (en) * 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US8073157B2 (en) * 2003-08-27 2011-12-06 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US8947347B2 (en) * 2003-08-27 2015-02-03 Sony Computer Entertainment Inc. Controlling actions in a video game unit
US7623115B2 (en) * 2002-07-27 2009-11-24 Sony Computer Entertainment Inc. Method and apparatus for light input device
US7627139B2 (en) * 2002-07-27 2009-12-01 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US7854655B2 (en) * 2002-07-27 2010-12-21 Sony Computer Entertainment America Inc. Obtaining input for controlling execution of a game program
US7918733B2 (en) * 2002-07-27 2011-04-05 Sony Computer Entertainment America Inc. Multi-input game control mixer
US7782297B2 (en) * 2002-07-27 2010-08-24 Sony Computer Entertainment America Inc. Method and apparatus for use in determining an activity level of a user in relation to a system
DE10240497A1 (en) * 2002-09-03 2004-03-11 Robert Bosch Gmbh Radar measuring device and method for operating a radar measuring device
US20040054776A1 (en) * 2002-09-16 2004-03-18 Finisar Corporation Network expert analysis process
US7343524B2 (en) * 2002-09-16 2008-03-11 Finisar Corporation Network analysis omniscient loop state machine
US20040062308A1 (en) * 2002-09-27 2004-04-01 Kamosa Gregg Mark System and method for accelerating video data processing
US7200061B2 (en) * 2002-11-08 2007-04-03 Hitachi, Ltd. Sense amplifier for semiconductor memory device
US20040117442A1 (en) * 2002-12-10 2004-06-17 Thielen Kurt R. Handheld portable wireless digital content player
US20040123113A1 (en) * 2002-12-18 2004-06-24 Svein Mathiassen Portable or embedded access and input devices and methods for giving access to access limited devices, apparatuses, appliances, systems or networks
AU2003297389A1 (en) * 2002-12-19 2004-07-14 Fortescue Corporation Method and apparatus for determining orientation and position of a moveable object
US7339608B2 (en) * 2003-01-03 2008-03-04 Vtech Telecommunications Limited Wireless motion sensor using infrared illuminator and camera integrated with wireless telephone
JP3875196B2 (en) * 2003-02-10 2007-01-31 株式会社東芝 Service providing device, service receiving device, service providing program, service receiving program, proximity wireless communication device, service providing method, and service receiving method
US7009561B2 (en) * 2003-03-11 2006-03-07 Menache, Llp Radio frequency motion tracking system and method
TWI231668B (en) * 2003-03-12 2005-04-21 Nec Corp Transmission beam control method, adaptive antenna transmitter/receiver apparatus and radio base station
JP4336366B2 (en) * 2003-03-12 2009-09-30 インターナショナル・ビジネス・マシーンズ・コーポレーション [Ultra Wideband] Method and apparatus enabling use of application and medium access protocol for wireless optical link over radio channel
AU2003901463A0 (en) * 2003-03-31 2003-04-17 Qx Corporation Pty Ltd A method and device for multipath mitigation in positioning systems using clustered positioning signals
CA2523480C (en) * 2003-04-25 2014-05-27 Xm Satellite Radio Inc. System and method for providing recording and playback of digital media content
US7391888B2 (en) * 2003-05-30 2008-06-24 Microsoft Corporation Head pose assessment methods and systems
US20050009604A1 (en) * 2003-07-11 2005-01-13 Hsien-Ta Huang Monotone voice activation device
US20050014468A1 (en) * 2003-07-18 2005-01-20 Juha Salokannel Scalable bluetooth multi-mode radio module
US7432846B2 (en) * 2003-08-12 2008-10-07 Trex Enterprises Corp. Millimeter wave imaging system
US7415244B2 (en) * 2003-08-12 2008-08-19 Trex Enterprises Corp. Multi-channel millimeter wave imaging system
US7385549B2 (en) * 2003-08-12 2008-06-10 Trex Enterprises Corp Millimeter wave portal imaging system
EP1659629B1 (en) * 2003-08-28 2011-05-04 Hitachi, Ltd. Semiconductor device and its manufacturing method
US7441154B2 (en) * 2003-09-12 2008-10-21 Finisar Corporation Network analysis tool
US7874917B2 (en) * 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US20050076161A1 (en) * 2003-10-03 2005-04-07 Amro Albanna Input system and method
US20050124307A1 (en) * 2003-12-08 2005-06-09 Xytrans, Inc. Low cost broadband wireless communication system
US20050132420A1 (en) * 2003-12-11 2005-06-16 Quadrock Communications, Inc System and method for interaction with television content
US20070160216A1 (en) * 2003-12-15 2007-07-12 France Telecom Acoustic synthesis and spatialization method
US20050185364A1 (en) * 2004-01-05 2005-08-25 Jory Bell Docking station for mobile computing device
US7663689B2 (en) * 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
TWI286036B (en) * 2004-02-10 2007-08-21 Realtek Semiconductor Corp Method for selecting a channel in a wireless network
US7148836B2 (en) * 2004-03-05 2006-12-12 The Regents Of The University Of California Obstacle penetrating dynamic radar imaging system
US9178953B2 (en) * 2004-03-18 2015-11-03 Nokia Technologies Oy Position-based context awareness for mobile terminal device
JP2005323340A (en) * 2004-04-07 2005-11-17 Matsushita Electric Ind Co Ltd Communication terminal and communication method
US20050245204A1 (en) * 2004-05-03 2005-11-03 Vance Scott L Impedance matching circuit for a mobile communication device
JP3866735B2 (en) * 2004-05-10 2007-01-10 株式会社東芝 Multifunction mobile communication terminal
US7671916B2 (en) * 2004-06-04 2010-03-02 Electronic Arts Inc. Motion sensor using dual camera inputs
US8027165B2 (en) * 2004-07-08 2011-09-27 Sandisk Technologies Inc. Portable memory devices with removable caps that effect operation of the devices when attached
US8016667B2 (en) * 2004-07-22 2011-09-13 Igt Remote gaming eligibility system and method using RFID tags
US8109858B2 (en) * 2004-07-28 2012-02-07 William G Redmann Device and method for exercise prescription, detection of successful performance, and provision of reward therefore
US7242359B2 (en) * 2004-08-18 2007-07-10 Microsoft Corporation Parallel loop antennas for a mobile electronic device
KR100890060B1 (en) * 2004-08-27 2009-03-25 삼성전자주식회사 System and Method for Controlling Congestion of Group Call Response Message On Access Channel
US7590589B2 (en) * 2004-09-10 2009-09-15 Hoffberg Steven M Game theoretic prioritization scheme for mobile ad hoc networks permitting hierarchal deference
US7702876B2 (en) * 2004-09-22 2010-04-20 Xyratex Technology Limited System and method for configuring memory devices for use in a network
EP1646112A1 (en) * 2004-10-11 2006-04-12 Sony Deutschland GmbH Directivity control for short range wireless mobile communication systems
US20060085675A1 (en) * 2004-10-12 2006-04-20 Andrew Popell One-touch backup system
US6967612B1 (en) * 2004-10-22 2005-11-22 Gorman John D System and method for standoff detection of human carried explosives
US7683883B2 (en) * 2004-11-02 2010-03-23 Pierre Touma 3D mouse and game controller based on spherical coordinates system and system for use
JP5004276B2 (en) * 2004-11-16 2012-08-22 学校法人日本大学 Sound source direction determination apparatus and method
US6965340B1 (en) * 2004-11-24 2005-11-15 Agilent Technologies, Inc. System and method for security inspection using microwave imaging
US20060148568A1 (en) * 2004-12-30 2006-07-06 Motorola, Inc. Device and method for wirelessly accessing game media
US7864159B2 (en) * 2005-01-12 2011-01-04 Thinkoptics, Inc. Handheld vision based absolute pointing system
JP2008528195A (en) * 2005-01-26 2008-07-31 ベントレー・キネティクス・インコーポレーテッド Method and system for analyzing and indicating motor movement
US20060189386A1 (en) * 2005-01-28 2006-08-24 Outland Research, L.L.C. Device, system and method for outdoor computer gaming
US20070055949A1 (en) * 2005-01-29 2007-03-08 Nicholas Thomas Methods and apparatus for rfid interface control
US7330702B2 (en) * 2005-01-31 2008-02-12 Taiwan Semiconductor Manufacturing Co., Ltd. Method and apparatus for inter-chip wireless communication
EP1688847B1 (en) * 2005-02-03 2011-05-04 Texas Instruments Incorporated Die-to-die interconnect interface and protocol for stacked semiconductor dies
US7502965B2 (en) * 2005-02-07 2009-03-10 Broadcom Corporation Computer chip set having on board wireless interfaces to support test operations
US7489870B2 (en) * 2005-10-31 2009-02-10 Searete Llc Optical antenna with optical reference
US20060203758A1 (en) * 2005-03-11 2006-09-14 Samsung Electronics Co., Ltd. Mobile terminal for relaying multimedia data to an external display device
US20060211494A1 (en) * 2005-03-18 2006-09-21 Helfer Lisa M Gaming terminal with player-customization of display functions
US7343177B2 (en) * 2005-05-03 2008-03-11 Broadcom Corporation Modular ear-piece/microphone (headset) operable to service voice activated commands
US20060256090A1 (en) * 2005-05-12 2006-11-16 Apple Computer, Inc. Mechanical overlay
US7733285B2 (en) * 2005-05-18 2010-06-08 Qualcomm Incorporated Integrated, closely spaced, high isolation, printed dipoles
US8116401B2 (en) * 2005-05-26 2012-02-14 Broadcom Corporation Method and system for digital spur cancellation
US8001353B2 (en) * 2005-06-10 2011-08-16 Hewlett-Packard Development Company, L.P. Apparatus and method for configuring memory blocks
US7218143B1 (en) * 2005-06-14 2007-05-15 Xilinx, Inc. Integrated circuit having fast interconnect paths between memory elements and carry logic
KR101257848B1 (en) * 2005-07-13 2013-04-24 삼성전자주식회사 Data storing apparatus comprising complex memory and method of operating the same
GB0515796D0 (en) * 2005-07-30 2005-09-07 Mccarthy Peter A motion capture and identification device
US7927216B2 (en) * 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
JP4262726B2 (en) * 2005-08-24 2009-05-13 任天堂株式会社 Game controller and game system
US8308563B2 (en) * 2005-08-30 2012-11-13 Nintendo Co., Ltd. Game system and storage medium having game program stored thereon
JP4471910B2 (en) * 2005-09-14 2010-06-02 任天堂株式会社 Virtual positioning program
US8471812B2 (en) * 2005-09-23 2013-06-25 Jesse C. Bunch Pointing and identification device
US8142287B2 (en) * 2005-10-11 2012-03-27 Zeemote Technology Inc. Universal controller for toys and games
JP4859433B2 (en) * 2005-10-12 2012-01-25 任天堂株式会社 Position detection system and position detection program
US7715432B2 (en) * 2005-11-14 2010-05-11 Broadcom Corporation Primary protocol stack having a secondary protocol stack entry point
US8180363B2 (en) * 2005-11-15 2012-05-15 Sony Computer Entertainment Inc. Communication apparatus preventing communication interference
US7613482B2 (en) * 2005-12-08 2009-11-03 Accton Technology Corporation Method and system for steering antenna beam
US7170440B1 (en) * 2005-12-10 2007-01-30 Landray Technology, Inc. Linear FM radar
US20070135243A1 (en) * 2005-12-12 2007-06-14 Larue Michael B Active sports tracker and method
TWI286484B (en) * 2005-12-16 2007-09-11 Pixart Imaging Inc Device for tracking the motion of an object and object for reflecting infrared light
US7602301B1 (en) * 2006-01-09 2009-10-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US7714780B2 (en) * 2006-03-10 2010-05-11 Broadcom Corporation Beamforming RF circuit and applications thereof
JP4151982B2 (en) * 2006-03-10 2008-09-17 任天堂株式会社 Motion discrimination device and motion discrimination program
US20070224944A1 (en) * 2006-03-10 2007-09-27 Hsiang Chen Portable device having changeable operating modes
US7899394B2 (en) * 2006-03-16 2011-03-01 Broadcom Corporation RFID system with RF bus
US7423587B2 (en) * 2006-04-02 2008-09-09 Rolf Mueller Method for frequency-driven generation of a multiresolution decomposition of the input to wave-based sensing arrays
US8176230B2 (en) * 2006-04-07 2012-05-08 Kingston Technology Corporation Wireless flash memory card expansion system
US7539533B2 (en) * 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
US20070268481A1 (en) * 2006-05-17 2007-11-22 Ramesh Raskar System and method for measuring scene reflectance using optical sensors
KR100753041B1 (en) * 2006-05-29 2007-08-30 삼성전자주식회사 Mobile terminal with a virtual mode dial and method for operating thereof
JP4208898B2 (en) * 2006-06-09 2009-01-14 株式会社ソニー・コンピュータエンタテインメント Object tracking device and object tracking method
US7816747B2 (en) * 2006-07-06 2010-10-19 International Business Machines Corporation Detector for detecting electromagnetic waves
US20080028118A1 (en) * 2006-07-31 2008-01-31 Craig Peter Sayers Portable dock for a portable computing system
US20080070682A1 (en) * 2006-08-15 2008-03-20 Nintendo Of America Inc. Systems and methods for providing educational games for use by young children, and digital storage mediums for storing the educational games thereon
US7860467B2 (en) * 2006-08-29 2010-12-28 Broadcom Corporation Power control for a dual mode transmitter
US20080070516A1 (en) * 2006-09-15 2008-03-20 Plantronics, Inc. Audio data streaming with auto switching between wireless headset and speakers
US20080076406A1 (en) * 2006-09-22 2008-03-27 Vanu, Inc. Wireless Backhaul
US20080136775A1 (en) * 2006-12-08 2008-06-12 Conant Carson V Virtual input device for computing
US8340057B2 (en) * 2006-12-22 2012-12-25 Canon Kabushiki Kaisha Automated wireless access to peripheral devices
WO2008088870A1 (en) * 2007-01-19 2008-07-24 Progressive Gaming International Corporation Table monitoring identification system, wager tagging and felt coordinate mapping
US9486703B2 (en) * 2007-01-31 2016-11-08 Broadcom Corporation Mobile communication device with game application for use in conjunction with a remote mobile communication device and methods for use therewith
US20090017910A1 (en) * 2007-06-22 2009-01-15 Broadcom Corporation Position and motion tracking of an object
US20090011832A1 (en) * 2007-01-31 2009-01-08 Broadcom Corporation Mobile communication device with game application for display on a remote monitor and methods for use therewith
FI121980B (en) * 2007-02-16 2011-06-30 Voyantic Oy Method for characterizing a radio link
US20080244466A1 (en) * 2007-03-26 2008-10-02 Timothy James Orsley System and method for interfacing with information on a display screen
US20080242414A1 (en) * 2007-03-29 2008-10-02 Broadcom Corporation, A California Corporation Game devices with integrated gyrators and methods for use therewith
US7647071B2 (en) * 2007-03-29 2010-01-12 Broadcom Corporation Communication devices with integrated gyrators and methods for use therewith
JP2008271023A (en) * 2007-04-18 2008-11-06 Univ Of Electro-Communications Antenna system
US10504317B2 (en) * 2007-04-30 2019-12-10 Cfph, Llc Game with player actuated control structure
US8209540B2 (en) * 2007-06-28 2012-06-26 Apple Inc. Incremental secure backup and restore of user settings and data
WO2009062153A1 (en) * 2007-11-09 2009-05-14 Wms Gaming Inc. Interaction with 3d space in a gaming system
US7895365B2 (en) * 2008-02-06 2011-02-22 Broadcom Corporation File storage for a computing device with handheld and extended computing units
CN102016877B (en) * 2008-02-27 2014-12-10 索尼计算机娱乐美国有限责任公司 Methods for capturing depth data of a scene and applying computer actions
US7671802B2 (en) * 2008-03-17 2010-03-02 Disney Enterprises, Inc. Active player tracking
EP2443779B1 (en) * 2009-06-19 2020-08-05 BlackBerry Limited Uplink transmissions for type 2 relay

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040017313A1 (en) * 2002-03-12 2004-01-29 Alberto Menache Motion tracking system and method
US20060274911A1 (en) * 2002-07-27 2006-12-07 Xiadong Mao Tracking device with sound emitter for use in obtaining information for controlling game program execution

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110207444A1 (en) * 2007-06-14 2011-08-25 Hansen Christopher J Method And System For 60 GHZ Location Determination Based On Varying Antenna Direction And Coordination Of WLAN/WPAN/GPS Multimode Devices
US20080311851A1 (en) * 2007-06-14 2008-12-18 Hansen Christopher J Method and system for 60 GHZ location determination and coordination of WLAN/WPAN/GPS multimode devices
US8320877B2 (en) * 2007-06-14 2012-11-27 Broadcom Corporation Method and system for 60 GHz location determination and coordination of WLAN/WPAN/GPS multimode devices
US20120157120A1 (en) * 2007-06-14 2012-06-21 Broadcom Corporation Method and system for 60 ghz location determination and coordination of wlan/wpan/gps multimode devices
US7912449B2 (en) * 2007-06-14 2011-03-22 Broadcom Corporation Method and system for 60 GHz location determination and coordination of WLAN/WPAN/GPS multimode devices
US8126425B2 (en) * 2007-06-14 2012-02-28 Broadcom Corporation Method and system for 60 GHZ location determination based on varying antenna direction and coordination of WLAN/WPAN/GPS multimode devices
US7952962B2 (en) * 2007-06-22 2011-05-31 Broadcom Corporation Directional microphone or microphones for position determination
US20080316863A1 (en) * 2007-06-22 2008-12-25 Broadcom Corporation Directional microphone or microphones for position determination
US20090273340A1 (en) * 2008-05-01 2009-11-05 Cory James Stephanson Self-calibrating magnetic field monitor
US9404988B2 (en) * 2008-05-01 2016-08-02 Broadband Discovery Systems, Inc. Self-calibrating magnetic field monitor
US8120354B2 (en) * 2008-05-01 2012-02-21 Broadband Discovery Systems, Inc. Self-calibrating magnetic field monitor
US20170010335A1 (en) * 2008-05-01 2017-01-12 Broadband Discovery Systems, Inc. Self-calibrating magnetic field monitor
US20120136606A1 (en) * 2008-05-01 2012-05-31 Cory James Stephanson Self-calibrating magnetic field monitor
US20140285183A1 (en) * 2008-05-01 2014-09-25 Broadband Discovery Systems, Inc. Self-calibrating magnetic field monitor
US8729891B2 (en) * 2008-05-01 2014-05-20 Broadband Discovery Systems, Inc. Self-calibrating magnetic field monitor
US10107872B2 (en) * 2008-05-01 2018-10-23 Mis Security, Llc Self-calibrating magnetic field monitor
US20100109849A1 (en) * 2008-10-30 2010-05-06 Nec (China)Co., Ltd. Multi-objects positioning system and power-control based multiple access control method
US8548490B2 (en) * 2008-10-30 2013-10-01 Nec (China) Co., Ltd. Multi-objects positioning system and power-control based multiple access control method
US11397264B2 (en) 2010-02-24 2022-07-26 Sportsmedia Technology Corporation Tracking system
US10613226B2 (en) * 2010-02-24 2020-04-07 Sportsmedia Technology Corporation Tracking system
US11022690B2 (en) 2010-02-24 2021-06-01 Sportsmedia Technology Corporation Tracking system
US11874373B2 (en) 2010-02-24 2024-01-16 Sportsmedia Technology Corporation Tracking system
US9880604B2 (en) 2011-04-20 2018-01-30 Microsoft Technology Licensing, Llc Energy efficient location detection
US10165228B2 (en) * 2011-12-22 2018-12-25 Mis Security, Llc Sensor event assessor training and integration
US9619999B2 (en) 2011-12-22 2017-04-11 Broadband Discovery Systems, Inc. Sensor event assessor input/output controller
US8902936B2 (en) 2011-12-22 2014-12-02 Cory J. Stephanson Sensor event assessor input/output controller
US20130162813A1 (en) * 2011-12-22 2013-06-27 Cory J. Stephanson Sensor event assessor training and integration
US9225783B2 (en) 2011-12-22 2015-12-29 Cory J. Stephanson Sensor event assessor input/output controller
US9665702B2 (en) 2011-12-23 2017-05-30 Microsoft Technology Licensing, Llc Restricted execution modes
US10249119B2 (en) 2011-12-23 2019-04-02 Microsoft Technology Licensing, Llc Hub key service
US8874162B2 (en) 2011-12-23 2014-10-28 Microsoft Corporation Mobile device safe driving
US9680888B2 (en) 2011-12-23 2017-06-13 Microsoft Technology Licensing, Llc Private interaction hubs
US9710982B2 (en) 2011-12-23 2017-07-18 Microsoft Technology Licensing, Llc Hub key service
US9736655B2 (en) 2011-12-23 2017-08-15 Microsoft Technology Licensing, Llc Mobile device safe driving
US9467834B2 (en) 2011-12-23 2016-10-11 Microsoft Technology Licensing, Llc Mobile device emergency service
US9325752B2 (en) 2011-12-23 2016-04-26 Microsoft Technology Licensing, Llc Private interaction hubs
US9491589B2 (en) 2011-12-23 2016-11-08 Microsoft Technology Licensing, Llc Mobile device safe driving
US9363250B2 (en) 2011-12-23 2016-06-07 Microsoft Technology Licensing, Llc Hub coordination service
US9420432B2 (en) 2011-12-23 2016-08-16 Microsoft Technology Licensing, Llc Mobile devices control
US9230076B2 (en) 2012-08-30 2016-01-05 Microsoft Technology Licensing, Llc Mobile device child share
US9998866B2 (en) 2013-06-14 2018-06-12 Microsoft Technology Licensing, Llc Detecting geo-fence events using varying confidence levels
US9820231B2 (en) 2013-06-14 2017-11-14 Microsoft Technology Licensing, Llc Coalescing geo-fence events
US10033945B2 (en) * 2013-12-12 2018-07-24 Flir Systems Ab Orientation-adapted image remote inspection systems and methods
US20150172567A1 (en) * 2013-12-12 2015-06-18 Flir Systems Ab Orientation-adapted image remote inspection systems and methods
US10434396B2 (en) * 2015-11-30 2019-10-08 James Shaunak Divine Protective headgear with display and methods for use therewith
US11103761B2 (en) 2015-11-30 2021-08-31 James Shaunak Divine Protective headgear with display and methods for use therewith
US20170017874A1 (en) * 2016-05-06 2017-01-19 Qualcomm Incorporated Radio frequency identification (rfid) reader with frequency adjustment of continuous radio frequency (rf) wave
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions
US20180239419A1 (en) * 2017-02-21 2018-08-23 WiseJet, Inc. Wireless transceiver system using beam tracking
US10747303B2 (en) * 2017-10-13 2020-08-18 Tactual Labs Co. Backscatter hover detection
US20190121424A1 (en) * 2017-10-13 2019-04-25 Tactual Labs Co. Backscatter hover detection
US20210394068A1 (en) * 2020-06-23 2021-12-23 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method

Also Published As

Publication number Publication date
US20170232345A1 (en) 2017-08-17
US8676257B2 (en) 2014-03-18
US20080318683A1 (en) 2008-12-25
US9547080B2 (en) 2017-01-17
US20120315991A1 (en) 2012-12-13
US20170232346A1 (en) 2017-08-17
US20200129861A1 (en) 2020-04-30
US7952962B2 (en) 2011-05-31
US11426660B2 (en) 2022-08-30
US7973702B2 (en) 2011-07-05
US20080318675A1 (en) 2008-12-25
US20080316103A1 (en) 2008-12-25
US20080318684A1 (en) 2008-12-25
US8160640B2 (en) 2012-04-17
US20080316085A1 (en) 2008-12-25
US20090017910A1 (en) 2009-01-15
US20080318680A1 (en) 2008-12-25
US20080318681A1 (en) 2008-12-25
US20090258706A1 (en) 2009-10-15
US8289212B2 (en) 2012-10-16
US9943760B2 (en) 2018-04-17
US20090273559A1 (en) 2009-11-05
US8031121B2 (en) 2011-10-04
US20080318673A1 (en) 2008-12-25
US20080316324A1 (en) 2008-12-25
US8628417B2 (en) 2014-01-14
US20080318689A1 (en) 2008-12-25
US20110312421A1 (en) 2011-12-22
US9417320B2 (en) 2016-08-16
US20080318682A1 (en) 2008-12-25
US9523767B2 (en) 2016-12-20
US20080316863A1 (en) 2008-12-25
US8062133B2 (en) 2011-11-22
US20080318625A1 (en) 2008-12-25
US20130023290A1 (en) 2013-01-24
US10549195B2 (en) 2020-02-04
US20120129606A1 (en) 2012-05-24
US20080318626A1 (en) 2008-12-25
US8311579B2 (en) 2012-11-13
US20080318691A1 (en) 2008-12-25

Similar Documents

Publication Publication Date Title
US20080318595A1 (en) Position location system using multiple position location techniques
US8085199B2 (en) Receiver including a matrix module to determine angular position
Adib et al. Multi-Person Localization via RF Body Reflections
US10984146B2 (en) Tracking safety conditions of an area
US11893317B2 (en) Method and apparatus for associating digital content with wireless transmission nodes in a wireless communication area
JP2007521474A (en) Radio frequency motion tracking system and method
US11106837B2 (en) Method and apparatus for enhanced position and orientation based information display
Nguyen et al. A review of smartphones‐based indoor positioning: Challenges and applications
Zhang et al. Mobi2Sense: empowering wireless sensing with mobility
US11119179B2 (en) System and method for determining the relative direction of an RF transmitter
US11475177B2 (en) Method and apparatus for improved position and orientation based information display
Andreasson et al. Sensors for mobile robots
US11175368B1 (en) System and method for determining the relative direction of an RF transmitter
US11175369B1 (en) System and method for determining the relative direction of an RF transmitter
KR102604367B1 (en) a high definition positioning and movement capturing device for virtual reality space service supply containing eXtended Reality
KR100777600B1 (en) A method and system for motion capture using relative coordinates
KR20140125018A (en) Method and apparatus for recognizing gesture or capturing motion

Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROFOUGARAN, AHMADREZA REZA;REEL/FRAME:021133/0655

Effective date: 20080618

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119