US20070150565A1 - Surveillance network system - Google Patents

Surveillance network system

Info

Publication number
US20070150565A1
US20070150565A1 (application Ser. No. 11/317,634)
Authority
US
United States
Prior art keywords
network
information
nodes
sensor
node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/317,634
Inventor
Arun Ayyagari
Kevin Ung
Rick Blair
Michael Foster
David Corman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Boeing Co
Original Assignee
Boeing Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Boeing Co filed Critical Boeing Co
Priority to US11/317,634 priority Critical patent/US20070150565A1/en
Assigned to BOEING COMPANY, THE reassignment BOEING COMPANY, THE ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: FOSTER, MICHAEL S., BLAIR, RICK, CORMAN, DAVID E., AYYAGARI, ARUN, UNG, KEVIN Y.
Priority to AT08018658T priority patent/ATE501585T1/en
Priority to PCT/US2006/043573 priority patent/WO2007078422A2/en
Priority to DE602006020637T priority patent/DE602006020637D1/en
Priority to EP08018609.1A priority patent/EP2026536B1/en
Priority to EP06837203A priority patent/EP1969818B1/en
Priority to EP08018658A priority patent/EP2019534B1/en
Publication of US20070150565A1 publication Critical patent/US20070150565A1/en
Priority to US13/223,508 priority patent/US10542093B2/en
Abandoned legal-status Critical Current

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00Network arrangements or protocols for supporting network services or applications
    • H04L67/01Protocols
    • H04L67/12Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W40/00Communication routing or communication path finding
    • H04W40/24Connectivity information management, e.g. connectivity discovery or connectivity update
    • H04W40/30Connectivity information management, e.g. connectivity discovery or connectivity update for proactive routing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L2209/00Additional information or applications relating to cryptographic mechanisms or cryptographic arrangements for secret or secure communication H04L9/00
    • H04L2209/80Wireless
    • H04L2209/805Lightweight hardware, e.g. radio-frequency identification [RFID] or sensor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/18Self-organising networks, e.g. ad-hoc networks or sensor networks

Definitions

  • the present invention relates generally to surveillance using networks, such as in a military, scientific, civic, or commercial context.
  • surveillance systems may use one or more deployed sensor devices that are capable of passing on collected information to users and/or user devices. For example, users may be able to go into the field and collect such information directly from field devices. More advanced surveillance systems may use some form of remote connection to automatically send collected information back to a data collection system (or the like), so that the collected information can be analyzed, stored and tracked over time, etc.
  • these current systems have limitations, including those related to limited energy supply for field devices, sensor deployment and placement issues, remote information storage and retrieval issues, satellite issues, network bandwidth issues, disruption issues, obstruction issues, etc.
  • information multiplication problems may exist, which may overload human users of the information. For example, current surveillance systems may produce only a small amount of relevant information and a relatively large amount of irrelevant information, which users must then filter through.
  • a sensor network system for surveillance of an environment may be used in commercial operations, civic operations, scientific operations, military operations, etc.
  • Once deployed (e.g., via an aerial and/or terrestrial deployment strategy), the sensor network system may operate intelligently using an autonomous framework. For example, each node in the network system may operate as an individual device with its own job and purpose. For some designated network nodes (e.g., “full function devices”), this job/purpose may require that the network node act intelligently. In such cases, the network node is equipped with some level of processing/decision-making capabilities.
  • the network node is configured only for simple and/or limited-purpose operation (e.g., configured for sensing and performing basic RF communications). In either case, communication with other nodes in the network allows each node to play an autonomous yet active role in the sensor network system. Accordingly, the sensor network system can efficiently react to an array of conditions, fuse relevant data in an intelligent way, and, to varying extents, self-organize and self-manage.
  • a group of sensors that form part of the sensor network system is deployed on a bridge to monitor traffic for enemy presence in a military context.
  • This group of sensors includes various primary sensors that, in this case, are sensitive to vibrations, as well as secondary sensors that, in this case, are image sensors (which include some basic image processing capabilities) and acoustical sensors (which include some basic sound processing capabilities).
  • Some of the secondary sensors in the sensor network system include information fusing capabilities. That is, these sensors have the ability to aggregate information collected by different sensors/nodes to produce more useful information.
  • the sensors in the bridge example are configured to remain in a “sleep mode” with the exception of the primary vibration sensors. If there is activity on the bridge, the vibration sensors will detect it and initiate a process that “wakes” the secondary image sensors and acoustical sensors, which in turn, gather any necessary information. Because some of the image/acoustical sensors in this example are “smart” devices, they can tell whether the traffic on the bridge may be something that human users of the network are interested in. If so, they can activate additional sensors/devices.
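The sleep/wake cascade in the bridge example can be sketched as follows. This is an illustrative Python sketch; the class names and the vibration threshold are assumptions, not part of the disclosure.

```python
# Hypothetical sketch of a primary sensor waking secondary sensors on activity.

class SensorNode:
    def __init__(self, name):
        self.name = name
        self.awake = False          # secondary sensors start in "sleep mode"

    def wake(self):
        self.awake = True

class VibrationSensor(SensorNode):
    def __init__(self, name, threshold):
        super().__init__(name)
        self.threshold = threshold
        self.awake = True           # primary vibration sensors stay awake

    def detect(self, amplitude, secondaries):
        # On activity above the threshold, wake the secondary
        # image/acoustic sensors so they can gather information.
        if amplitude >= self.threshold:
            for s in secondaries:
                s.wake()
            return True
        return False

primary = VibrationSensor("vibration-1", threshold=0.5)
secondaries = [SensorNode("image-1"), SensorNode("acoustic-1")]

primary.detect(0.2, secondaries)            # below threshold: no wake-up
asleep = [s.awake for s in secondaries]
primary.detect(0.9, secondaries)            # above threshold: cascade fires
woken = [s.awake for s in secondaries]
```

In a fuller sketch, the woken "smart" sensors would in turn decide whether to activate additional devices, mirroring the cascade described above.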
  • sensors in the network system may be able to determine the best sensor viewpoints for event data.
  • select intelligent sensors fuse data together, including data received from other endpoints/sensors.
  • the sensors and network nodes then transmit aspects of the collected information to a network controller (e.g., through a set of one or more network routers).
  • the network controller then passes the information on to the appropriate system/external network for user consumption and/or additional processing.
  • the network controller can act as a primary host for application services that allow interchange between nodes of the sensor network and entities within one or more external networks/systems.
  • interactions between the network controllers and the one or more external networks/systems may be based on, for example, a publisher/subscriber model. This configuration reduces the amount of information that human users filter through, conserves energy expenditures at the network nodes (because nodes that are not currently needed can sleep) and allows network resources to be used in an efficient way.
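The publisher/subscriber interchange mentioned above can be sketched minimally as below; the topic names and message shapes are hypothetical.

```python
# Minimal publisher/subscriber sketch: external systems subscribe to topics
# at the network controller, and only subscribed information is delivered,
# reducing the volume of irrelevant information users must filter through.

class NetworkController:
    def __init__(self):
        self.subscribers = {}           # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        # Deliver only to subscribers of this topic; unsubscribed
        # topics are simply not forwarded.
        for cb in self.subscribers.get(topic, []):
            cb(message)

controller = NetworkController()
received = []
controller.subscribe("bridge/traffic", received.append)
controller.publish("bridge/traffic", {"event": "vehicle", "count": 3})
controller.publish("weather/wind", {"speed": 12})   # no subscribers
```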
  • FIG. 1 is a system diagram showing an example of a configuration of a sensor network system in an embodiment.
  • FIG. 2 is a system diagram showing an example of one or more network controllers forming a hierarchical network controller system in an embodiment.
  • FIG. 3 is a block diagram showing an embodiment of a sensor network system with features of both a high data rate network and a low data rate network.
  • FIG. 4 is a diagram showing examples of deploying a sensor network system in some embodiments.
  • FIG. 5 is a flow diagram showing an example of a routine for disseminating information to nodes in a sensor network in an embodiment.
  • FIG. 6 is a flow diagram showing an example of a routine for exporting information from nodes in a sensor network.
  • FIG. 7 is a system diagram showing an example of a sensor network configuration based on mission phases in an embodiment.
  • FIG. 1 A block diagram illustrating an exemplary computing environment in accordance with the present disclosure.
  • aspects of embodiments of the invention can also be practiced in distributed computing environments, where tasks or modules are performed by remote processing devices that are linked through a communications network.
  • program modules or subroutines may be located in both local and remote memory storage devices.
  • aspects of the invention described below may be stored or distributed on computer-readable media, including magnetic and optically readable and removable computer disks, as well as distributed electronically over networks. Data structures and transmissions of data particular to aspects of the invention are also encompassed within the scope of the invention.
  • FIG. 1 shows an example of a configuration of a sensor network system 100 in an embodiment.
  • the sensor network system 100 may provide various capabilities including self-configuration capabilities, self-healing capabilities, and intelligent cooperative sensing. Other capabilities of the sensor network system 100 may include data storage and retrieval functionality, autonomous decision making capabilities, store and forward capabilities, and resource-aware sensing capabilities.
  • the sensor network system 100 may include at least three classes of devices, including full function devices, reduced function devices, and non-intelligent end devices. More specifically, the full function and reduced function devices of the sensor network system 100 may include network controllers 105 (full function devices), network routers 110 (full or reduced function devices), and network-capable end devices 115 (full or reduced function devices) including smart sensors (e.g., sensors with image processing capabilities), each having some level of network capabilities and some possibly functioning as gateways with respect to other network nodes. In some embodiments, the full function devices 105 are knowledgeable about the sensor network topology and are aware of alternate multi-path routes to reach the network controller. The non-intelligent end devices may include a variety of active and/or passive sensors 120 .
  • sensors may include audio/acoustic sensors, imaging sensors, video sensors, infrared sensors, RF sensors, vibration/seismic sensors, magnetic sensors, chemical sensors, etc.
  • the sensors may be low energy and self-contained and provide basic sensor functionality, data dissemination and/or command/control execution. Because they may lack their own network capabilities, for such active and/or passive sensors/devices 120 to function as part of the network, they may be used in conjunction with network capable end devices 115 .
  • the sensors may be small (e.g., to prevent detection or remain unobtrusive) and/or come with a casing/shield that protects them against harsh environmental conditions.
  • the sensor devices may be self-powered (e.g., contain long-life batteries, operate on heat or solar energy, etc.) and consume low amounts of energy (e.g., by being energy efficient and having stand-by or inactive modes).
  • image sensors may employ power-aware image compression and storage and power adaptation methods that are tailored to extended low level computation within the sensor.
  • the network connection components of the sensor network system 100 may include both high speed links 125 and low speed links ( 130 and 135 ).
  • groups of one or more sensors and/or end devices may be linked to a network router 110 in a “star” configuration.
  • the respective network router 110 (which provides both data routing and network management functionalities) may be linked to one or more other network routers 110 (e.g. using either high speed mesh links 125 or low speed mesh links 130 ), forming a mesh of network routers 110 that are, in-turn, linked to one or more network controllers 105 .
  • aspects of some embodiments of the sensor network system may include use of wireless personal area network (WPAN) technology and/or wireless local area network (WLAN) technology.
  • IEEE standards may be used to implement such wireless network technology, including standards from the IEEE 802.15 family (e.g., 802.15.1, 802.15.2, 802.15.3, 802.15.3a, 802.15.4, etc.) for WPAN and standards from the IEEE 802.11 family (e.g., 802.11a, 802.11b, 802.11g, etc.) for WLAN.
  • any type of data link mechanism may be used including satellite, Bluetooth, and/or infrared/optical techniques, cellular or digital wireless communications, wired or wireless local area network, use of existing network infrastructure, etc., and any combination of such data link mechanisms.
  • intermittent data link mechanisms such as bundled custodial-based network communications, may also be used to conserve resources, including bandwidth.
  • one or more network controllers 105 may form a hierarchical network controller system 200 that manages overall sensor network configuration and operations, performs gateway/proxy functions, and provides access to external networks.
  • each network controller 105 in the network controller system 200 may be configured to accept, remove, and configure devices in the network (e.g., assign addresses and provide routing tables to enable efficient and effective communication of network nodes).
  • the network controllers may also support both dynamic and periodic network synchronization, as well as support peer-to-peer communication among network nodes.
  • the network controllers 105 may issue command and control information to end sensors (described in more detail with respect to FIG. 5 ) and receive data from sensors so that such data may be forwarded to external networks.
  • the network controllers 105 may serve as the primary interface between the external networks and the end device sensors.
  • network controllers 105 may configure router nodes to perform a repeater function to extend the range of the sensor network and may perform “frequency management,” including implementing spatial reuse plans for the sensor network.
  • the network controllers 105 may each maintain location and operational state information for devices/nodes within the sensor network.
  • the one or more network controllers may be linked to a management subsystem 205 that provides both system management to the sensor networks and information management services to external networks.
  • System management may include planning, deployment, monitoring, and network management.
  • Information management services may include storage, discovery, data transformation, messaging, security, and enterprise service management.
  • the sensor network system (which may consist of many nodes dispersed over a potentially wide geographical area) may employ a combination approach for data interchange consisting of low data rate links capable of information transfer over longer ranges and high data rate links capable of large information transfers over relatively shorter ranges.
  • This type of approach solves problems associated with power consumption and resource conservation in a sensor network system having diverse energy consumption needs (which may be limited and/or fixed) and complex and dynamic communication needs.
  • the capabilities of each sensor node within the sensor network system can vary (i.e., some sensor nodes have greater computing and communication capabilities than others). While it is true that sensor network systems may function in an ad hoc manner, in some embodiments, communication to and from a particular sensor node is more akin to a client-server interaction.
  • each sensor node may interface with another network node (e.g., a network controller 105 or network router 110 of FIG. 1 ) that functions as a gateway, dynamically establishing a communication link between the nodes that allows for information within the network to be gathered and directed for remote processing in an environment where computing and communication resources are less constrained.
  • the sensor network system can be envisioned as a hierarchical tree structure, such as a directed acyclic graph (DAG), with the root node of the hierarchical tree being the gateway node and sensor nodes forming various tiers of child/leaf nodes, as roughly depicted in FIG. 1 .
  • this hierarchical tree data model/framework results in sensor nodes closer to the gateway node performing more in-transit forwarding between their higher and lower tier sensor nodes.
  • this type of conservation is especially desirable for intermediate in-transit forwarding nodes.
  • the sensor network is configured so that at least some of the child/leaf sensor nodes are each able to communicate directly with the gateway node via low data rate links.
  • data-intensive information interchanges between a given child/leaf sensor node and a gateway node may involve multiple intermediary in-transit hops using high data rate links, which have shorter ranges.
  • this combination approach facilitates implementation of a link power budget and/or frequency/spectrum reuse plan.
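The combination approach described above can be sketched as a simple transfer planner: a small message travels directly to the gateway over the long-range low data rate link, while a data-intensive transfer is staged over shorter high data rate hops. The link ranges and size cutoff below are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical link parameters for the two-tier approach.
LOW_RATE_RANGE_M = 1000       # low data rate: longer range, less bandwidth
HIGH_RATE_RANGE_M = 100       # high data rate: shorter range, more bandwidth
LARGE_TRANSFER_BYTES = 50_000

def plan_transfer(payload_bytes, distance_to_gateway_m):
    """Pick a link strategy for moving payload_bytes to the gateway node."""
    if payload_bytes < LARGE_TRANSFER_BYTES and distance_to_gateway_m <= LOW_RATE_RANGE_M:
        return ("low-rate-direct", 1)       # one long-range low data rate hop
    # Data-intensive interchange: multiple intermediary in-transit hops,
    # each within high data rate range.
    hops = -(-distance_to_gateway_m // HIGH_RATE_RANGE_M)   # ceiling division
    return ("high-rate-multihop", hops)

command = plan_transfer(2_000, 800)         # small command/control message
video = plan_transfer(5_000_000, 800)       # streamed imagery
```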
  • the communication requirements of the sensor network system may differ based on varying levels of network capacity and power needs, as well as mission requirements. For example, many sensor network nodes are sensitive to power consumption, with less capable nodes most likely using less bandwidth and more capable nodes using more bandwidth, since bandwidth is proportional to power consumption (the communication component is typically the highest power drain of any sensor node element). In addition to power consumption, generally, more capable nodes have more data to transmit, are larger, and likely have more capacity for power storage. Less capable nodes are likely to be smaller and need less network bandwidth.
  • a sensor network system 300 in accordance with some embodiments may combine features of a high data rate network 305 with features of a low data rate network 310 .
  • the sensor network system 300 illustrated in FIG. 3 utilizes low data rate communications for the dissemination of, for example, command and control-type information (used in sensor and network management) and the transfer of information among sensor nodes having simple primary transducers and uses high data rate communications for sensor nodes experiencing larger information and data streaming interchanges.
  • the determination of whether to employ either the high data rate 305 or the low data rate network features 310 may be based on a number of factors such as, capability of the node, capabilities of the surrounding nodes, criticality and latency constraints of the data, amount of data to be transferred, physical and logical state of the sensor nodes involved in the interchange, energy use requirements/limits, geographical location, frequency/spectrum reuse plans, etc. This determination may be variable (e.g., it may change from mission to mission, as new resources become available, or even transaction by transaction, as some nodes are configured to use both types of network features).
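A minimal sketch of such a per-interchange determination follows. The factor names, threshold, and rule ordering are illustrative assumptions; the disclosure lists the factors without fixing a particular rule.

```python
# Hedged sketch: choose high or low data rate network features for one
# interchange based on a few of the factors listed above.

def select_network(node_is_full_function, peer_is_full_function,
                   payload_bytes, latency_critical):
    # High data rate features may be limited to interactions among
    # full function devices.
    if not (node_is_full_function and peer_is_full_function):
        return "low"
    # Large or latency-critical interchanges favor high data rate features.
    if payload_bytes > 100_000 or latency_critical:
        return "high"
    # Small command/control-type messages stay on the low data rate network.
    return "low"

stream = select_network(True, True, 5_000_000, latency_critical=True)
command = select_network(True, False, 200, latency_critical=False)
```

Because the decision is made per transaction, the same node may use both network features at different times, as the text above notes.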
  • the high data rate network features 305 may provide high bandwidth, short-range connectivity for transferring data-dense information within the network 300 (e.g., by supporting applications that allow for on-demand imaging and video capture and transmission to computing devices performing information and decision support processing).
  • information from array sensor nodes, such as image capture sensors, benefits from the movement of larger amounts of data with stringent latency controls, favoring high data rate/bandwidth transfer.
  • data movement for the array sensor nodes is likely bursty in nature and event driven, thus favoring high data rate network features, and involves high power requirements.
  • High data transfer rates may be in the range of gigabits/second or higher (or high megabits/second), while low data transfer rates may be in the range of megabits/second or lower.
  • the low data rate network features 310 may provide lower bandwidth, long range connectivity for transferring less dense information within the network (e.g., allowing information transfer from sensors to computing devices performing information and decision support processing) and may be used to monitor and control aspects of both the high data rate network features and the low data rate network features.
  • the dissemination of command and control type information is ubiquitous across the network and occurs more or less continuously.
  • Command and control type messaging typically involves small messages (which use less bandwidth).
  • messages from sensor nodes supporting simple primary transducers, such as vibration and acoustic signatures tend to be small and have low bandwidth requirements.
  • a discrete sensor detects an event, wakes up from its sleep state, gathers data for a pre-determined period and prepares to send the gathered data to an upper layer fusion node. Since this is a low level sensor with minimal capability and is designed to maximize its lifetime through minimum power consumption, it is configured to send data at a minimal data rate.
  • discrete sensor data movement across the network is typically bursty in nature and the messages are likely small to medium in size, which again is facilitated by the use of low to medium bandwidth. Latency may be tightly specified, thus impacting capacity (bandwidth) requirements.
  • particular sensor nodes may be configured for communication using both high data rate network features and low data rate network features.
  • a sleeping video sensor is triggered into operation via a command from the fusion node in response to the data received from a discrete sensor (via low data rate network features).
  • the video sensor begins operation and, in turn, streams real-time video over the network (via high data rate features).
  • more capable sensor nodes may perform data aggregation and computation functions on behalf of the less capable sensor nodes.
  • more capable nodes can either work as an end device with high data rate mode or as an intermediary node to connect the less capable nodes to the controller.
  • the intermediary nodes typically support both the high data rate and the low data rate. For this type of node, the decision of which data rate to use is made at the application level of the node, which runs on the operating system of the sensor node.
  • routing for low data rate network features 310 may be based on hierarchical routing protocols with table-driven optimizations, while routing for high data rate network features 305 may be based on table-driven routing from source to network controller. This type of configuration may permit multiple paths between a given device and network controller for both low and high data rate networks.
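The multi-path, table-driven routing mentioned above can be sketched as follows; the node names and table contents are hypothetical.

```python
# Hypothetical table-driven routing toward the network controller, with
# multiple candidate next hops per node so an alternate path survives a
# failed router.

routing_table = {
    # node: ordered list of next hops toward the controller
    "sensor-7": ["router-2", "router-5"],
    "router-2": ["controller"],
    "router-5": ["router-2", "controller"],
}

def route(node, failed=frozenset()):
    """Return a path from node to the controller, skipping failed next hops."""
    if node == "controller":
        return ["controller"]
    for nxt in routing_table.get(node, []):
        if nxt in failed:
            continue
        rest = route(nxt, failed)
        if rest:
            return [node] + rest
    return None

primary_path = route("sensor-7")
backup_path = route("sensor-7", failed={"router-2"})
```

The ordered next-hop lists are what permit multiple paths between a given device and the network controller, as the text describes.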
  • low data rate network features 310 and the high data rate network features 305 in the context of a protocol stack (e.g., application layer, transport layer, network layer, link layer, physical layer, etc.).
  • IEEE 802.15.4 may be used as a starting point for link and physical layer protocols.
  • access to communication channels may be implemented via carrier sense multiple access with collision avoidance (CSMA/CA). This allows devices accessing the communication channels to maintain low duty cycle, while at the same time supporting a large number of devices.
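A simplified CSMA/CA-style access loop can be sketched as below. This is a hedged illustration of carrier sensing with binary exponential backoff; the slot windows and retry limit are assumptions, not the patent's parameters.

```python
import random

def csma_ca_access(channel_busy, max_retries=5, seed=0):
    """Sense the channel before transmitting; while busy, defer for a random
    number of slots drawn from a window that doubles each attempt (binary
    exponential backoff). Returns the number of sensing attempts used, or
    None if the device gives up."""
    rng = random.Random(seed)
    slots_waited = 0
    for attempt in range(max_retries):
        if not channel_busy():
            return attempt + 1          # channel idle: transmit now
        # Busy: back off before sensing again (a real device would sleep
        # for this many slot times, keeping its duty cycle low).
        slots_waited += rng.randrange(2 ** (attempt + 1))
    return None

busy_then_idle = iter([True, True, False])
attempts = csma_ca_access(lambda: next(busy_then_idle))
```

The random deferral is what lets a large number of devices share the channel while each maintains a low duty cycle.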
  • a network controller may use low data rate network features 310 to transmit reference signals to various network nodes/devices, thereby announcing its presence and making the network controller detectable to such network nodes/devices.
  • Some embodiments may also employ a time division multiple access (TDMA) beacon structure for implementing low data rate network features, which is useful in cases where dedicated bandwidth and low latency is desirable.
  • a network controller may transmit beacons at periodic intervals defining a super-frame structure and timing reference.
  • IEEE 802.11 may be used as a starting point for link and physical layer protocols.
  • access to communication channels may be implemented using a TDMA virtual beacon structure.
  • Aspects of the low data rate network features 310 may be used to define the super-frame and timing reference for the high data rate network TDMA structure.
  • the sensor network high data rate network features 305 may also employ a CSMA/CA mechanism as a backup (e.g., when connectivity via low data rate network system is disrupted).
  • the high data rate network features 305 may be limited to interactions among full function devices. Scheduling of network access by such devices may be performed in coordination with a network controller, which allows for information transfer from non-intelligent sensors to reduced functional and/or full function devices performing information and decision support processing.
  • each device may maintain dynamic synchronization by requesting a timing reference from the network controller via the out-of-band low data rate network features prior to the scheduled communication.
  • both endpoint devices (e.g., sensors) and intermediary communication devices may be aware of the route to reach the network controller, which manages the dissemination of the routes.
  • the sensor network system may be configured as a “smart network” that provides appropriate agile connectivity and bandwidth through awareness of network nodes, including monitoring their health, states, and conditions.
  • network controllers, or the like can be used to monitor the health and/or state of network nodes within the network over time.
  • This addresses the problem that sensor nodes within the sensor network that are not tethered have a finite life due to various conditions, such as limited power storage capacity, adverse environmental conditions, or being disabled by external entities such as the enemy.
  • the sensor nodes may be tampered with by external entities to signal erroneous information as a means of denial of service (DoS) attack.
  • backend sensor network management components (such as the management subsystem 205 of FIG. 2 ), or the like, monitor the health status of the sensor nodes to determine the effectiveness of each sensor node and whether such sensor nodes are capable of performing at or above threshold performance levels.
  • Monitoring the health and/or status of network nodes also enables the management subsystem to determine the validity of the information received from the particular node. For example, the management subsystem may perform authentication (directly and/or indirectly) to verify a node's identity and, thereby, validate the information received from the particular node.
  • a sensor node may be factory programmed with a unique serial number. Prior to deployment, such sensor nodes may also be programmed in the field with unique pre-placed security keys that further facilitate authentication. The management subsystem may then authenticate the sensor node based on its serial number and security keys using challenge/response mechanisms.
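The challenge/response scheme above can be sketched as follows. HMAC-SHA256 is an assumption for illustration; the disclosure describes pre-placed keys and challenge/response without naming an algorithm, and the serial numbers and key values below are hypothetical.

```python
import hmac, hashlib, os

# Management-side registry of factory serial numbers and field-programmed
# pre-placed keys (hypothetical values).
PRE_PLACED_KEYS = {"SN-0001": b"field-programmed-secret"}

def issue_challenge():
    return os.urandom(16)

def node_response(serial, key, challenge):
    # The node proves possession of its pre-placed key without transmitting it.
    return hmac.new(key, serial.encode() + challenge, hashlib.sha256).digest()

def authenticate(serial, challenge, response):
    key = PRE_PLACED_KEYS.get(serial)
    if key is None:
        return False
    expected = hmac.new(key, serial.encode() + challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = issue_challenge()
ok = authenticate("SN-0001", challenge,
                  node_response("SN-0001", b"field-programmed-secret", challenge))
bad = authenticate("SN-0001", challenge,
                   node_response("SN-0001", b"wrong-key", challenge))
```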
  • in embodiments where use of a full Public Key Infrastructure (PKI) is not practical, the sensor network system can facilitate authentication through alternate mechanisms, such as challenge/response and RF emission signature comparison.
  • each wireless transmitter has a unique RF emission signature.
  • the RF emission signature of a given sensor node can be compared against the RF emission signature profile stored in the management subsystem to verify its identity.
  • Once the physical identity of a given sensor node has been established, its health status and performance are monitored and profiled by the management subsystem. For example, state conditions that can be monitored include RF signal strength, power consumption, power state, response time, latency, thermal condition, etc. In this way, inconsistencies in the state of a network node (e.g., the occurrence of non-linear changes in the network node's behavior) can signal action by the network.
  • Such action may include terminating the problematic node's participation in the network (e.g., in the case of a node that is not capable of operating correctly or has otherwise been compromised); restricting the node's participation in the network; conducting further diagnostics on the node; reconfiguring the node (e.g., by facilitating a software update); generating a work order for repair of the node; deploying a new replacement node or set of replacement nodes; etc.
  • The monitoring or profiling of network nodes may be implemented using one or more techniques, including advertising/broadcasting by nodes (ranging from dumb devices to reduced function and full function devices) and/or querying by network controllers. Similar techniques may be used for accepting newly deployed nodes into the network.
  • The sensor network system may have multiple sensor nodes collecting data about similar/related environmental parameters. This implies that data gathered from a particular sensor node will very likely be consistent with data from other sensor nodes within its proximity.
  • Nodes within the same vicinity may be those located within a specified threshold distance and/or those positioned geographically in such a way that they can (theoretically) measure the same factor in the environment and provide results within a tolerance range, where the mission plan defines the tolerance range.
  • The management subsystem may analyze data received from various sensor nodes and establish the inter-relationships between the data gathered from peer sensor nodes within the same geographical region.
  • The management subsystem may assume that the temperature measurements received from each of these sensor nodes should, theoretically, be within a specified range. Measurements from sensor nodes that are beyond the expected range may then be considered suspect by the management subsystem. Once data received from a particular sensor node is deemed questionable, the management subsystem can attempt to re-authenticate the sensor and query it for its performance state information. If the management subsystem determines that the integrity of the data from a given sensor node cannot be established, it can appropriately account for it by ignoring data received from the problematic node and possibly disabling it. In addition, measurements that fall outside a specified tolerance range may be rejected.
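A minimal sketch of this cross-check, flagging readings that deviate from the regional peer median by more than the mission-defined tolerance. The median criterion is an assumption; the text only requires a specified range:

```python
import statistics

def flag_suspect_nodes(readings: dict, tolerance: float) -> set:
    """readings maps node id -> measurement (e.g., temperature) for peer
    nodes in the same geographical region; nodes whose reading deviates
    from the peer median by more than `tolerance` are flagged suspect."""
    median = statistics.median(readings.values())
    return {node for node, value in readings.items()
            if abs(value - median) > tolerance}
```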
  • The management subsystem may expect data received from a given sensor node within a given temporal period to be within certain bounds, based on the dynamics of one or more sensed parameters. For example, multiple data samples from a vibration sensor node within a short period can be expected to follow an estimated trajectory without sudden large deviations.
  • The management subsystem may profile the data received from the given sensor node to ensure that the node is functioning appropriately. Should the received data not meet the specifications, the management subsystem may perform re-authentication and diagnostics and, if need be, ignore data received from, and possibly disable, the particular sensor node if it does not meet the desired performance profile.
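The temporal check can be sketched as a bound on sample-to-sample change, an illustrative stand-in for the estimated-trajectory test described above:

```python
def deviates_suddenly(samples, max_step):
    """Return True if any pair of consecutive samples from a node (e.g., a
    vibration sensor) jumps by more than max_step, i.e., the data fails to
    follow a smooth trajectory within the temporal period."""
    return any(abs(b - a) > max_step for a, b in zip(samples, samples[1:]))
```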
  • The sensor network system may be configured for self-deployment, self-configuration, self-organization, and/or self-healing. This allows the network to be initialized and successfully maintained across wide (and sometimes difficult to access) geographic areas with little or no manual intervention for multiple missions. For example, in many cases, it is simply not viable to expect manual configuration of the sensor network in the field, especially in hostile environments.
  • The sensor network incorporates various self-organization, self-configuration, and self-healing techniques that allow network nodes to be effectively configured, organized, and managed within the sensor network system on an ongoing basis, while eliminating or minimizing the need for human intervention in this regard.
  • Deployment of nodes comprising the sensor network system may involve various terrestrial and/or aerial deployment strategies, e.g., so that wireless sensor nodes can be seeded in the field, potentially across wide geographical areas.
  • Deployment may involve dispersal of sensor network devices by persons, robots, unmanned air vehicles (UAVs), ground platforms, etc.
  • Troops or robots may deploy network nodes on the ground using a breadcrumb approach, where devices are dispensed as needed along a path as a person or robot progresses through the surveillance area.
  • Sensors may be placed at locations such that every sensor/network node is in communication with at least one other network node.
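This placement constraint can be checked with a simple sketch: every node must have at least one neighbor within radio range. Modeling radio range as a fixed radius is a simplifying assumption:

```python
import math

def no_isolated_nodes(positions, radio_range):
    """positions is a list of (x, y) coordinates for deployed nodes; return
    True when every node can reach at least one other node, a necessary
    (though not sufficient) condition for a connected network."""
    return all(
        any(math.dist(p, q) <= radio_range
            for j, q in enumerate(positions) if j != i)
        for i, p in enumerate(positions)
    )
```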
  • Aerial deployment (e.g., by UAV) may result in rougher placement of sensors.
  • Each of the sensor nodes within the sensor network system interfaces with a gateway node (e.g., a full function device or a network controller) that allows for information to be gathered and directed for processing at a remote location (e.g., a location where computing and communication resources are not constrained), resulting in a gateway-to-sensor and sensor-to-gateway communication model.
  • The sensor network system can be envisioned as a hierarchical tree structure, such as a directed acyclic graph (DAG), with the root node of the hierarchical tree being the gateway node and the sensor nodes forming various tiers of child/leaf nodes.
  • The gateway node is expected to periodically transmit beacon frames for deployed sensor nodes to synchronize with. This is not an issue for the gateway node, since it does not have the power, computing, and communication resource constraints experienced by sensor nodes.
  • One challenge involved in maintaining an effective sensor network that is self-configuring, self-organizing, and self-healing relates to the automatic discovery of the sensor nodes and establishment of the DAG that effectively connects the sensor network to the gateway node, which may be driven by particular mission objectives and may thus change over time.
  • Each sensor node determines where it stands relative to other nodes (e.g., within the hierarchical tree structure described above). Accordingly, while the seeding of the sensor nodes across a geographical area may be random at a micro level (i.e., not based on a specific or relative location), the distribution of sensor nodes at a macro level is organized based on the mission objectives.
  • The more capable sensor nodes establish direct or indirect connectivity with the gateway node for authentication and subsequently to receive command and control information from the gateway node. For example, soon after physical deployment, existing sensor nodes are configured to detect newly deployed nodes and incorporate them into the network in an organized and meaningful way. In one illustrative example, a new set of sensor nodes is physically deployed within the network. Upon deployment, these nodes each broadcast a signal to surrounding nodes in their vicinity (assuming the sensors were deployed in the proper area and such nodes actually exist). In some embodiments, more capable sensor nodes that have already been configured periodically transmit beacon frames to enable recently deployed, less capable sensor nodes to synchronize and associate with the given more capable sensor node.
  • The more capable sensor nodes also update the gateway node with the information and state of the less capable sensor nodes that have been associated with them.
  • The gateway node compiles this overall information on the sensor network state to compute the desired topology and routing hierarchy to be used by the sensor network system at each phase of the mission.
  • The computed routing information (primary and alternate) for each of the more capable sensor nodes is sent to the respective sensor nodes by the gateway node, thereby enabling self-configuring operation of the sensor network.
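The gateway-side computation can be sketched as a breadth-first search over the reported link map: the BFS parent becomes each node's primary next hop, and any other neighbor strictly closer to the gateway serves as an alternate. Limiting each node to one alternate is an illustrative simplification:

```python
from collections import deque

def compute_routes(links: dict, gateway: str):
    """links maps each node to its neighbor list as reported to the gateway.
    Returns (primary, alternate) next-hop tables for every reachable node."""
    depth = {gateway: 0}
    primary = {}
    queue = deque([gateway])
    while queue:
        u = queue.popleft()
        for v in links[u]:
            if v not in depth:
                depth[v] = depth[u] + 1
                primary[v] = u
                queue.append(v)
    alternate = {
        v: next((n for n in links[v]
                 if n != primary[v] and depth.get(n, float("inf")) < depth[v]),
                None)
        for v in primary
    }
    return primary, alternate
```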
  • The network controller may be programmed to send out information via lower level gateway nodes to each node that is to be affected by these newly deployed nodes. This information may specify the role/operation of the newly deployed nodes and provide rules of interaction between the new nodes and existing nodes.
  • The network controller may be programmed to send out self-configuration information for the newly deployed nodes, so that they may each be made aware of their specific operation/role within the network.
  • This specific operation/role may be based not only on the capabilities of a deployed node, but also on the actual location in which it is deployed.
  • The ultimate role and/or operation of a newly deployed node cannot be verified in advance and is not determined until the node has come to rest and its actual location coordinates can be determined.
  • The sensor network system may also perform self-configuration and self-organization when faced with instructions to perform a new task, activity, or mission. For example, given a new mission to monitor ground activity within an area defined by a set of coordinates, the network controller may send out new self-configuration/self-organization messages to an affected set of nodes within that area.
  • Problems in the network (e.g., defective or malfunctioning nodes) may also be handled using similar techniques.
  • For example, when a node is found to be defective, the network controller is programmed to send out instructions to affected nodes so that they can self-reconfigure to eliminate the defective node from the network.
  • Self-configuration, self-organization, and self-healing are performed via the communication of key information within the network, sample techniques for which are described below with respect to FIGS. 5 and 6.
  • FIG. 5 provides an example of a routine 500 for disseminating information to nodes in a sensor network in a particular embodiment.
  • Users of the sensor network may want to disseminate information to full and/or reduced function nodes of the network in order to configure the network in accordance with new performance requirements (e.g., as specified in a mission plan). This also facilitates the self-organizing and self-managing of the sensor network system.
  • The routine 500 of FIG. 5 is described from the perspective of a gateway node, such as a network controller node.
  • The network controller receives network configuration information from a source (e.g., one associated with a new mission plan), such as the management subsystem 205 of FIG. 2 or some other user-controlled source (including sources from an external network) that has access to the network controller.
  • The network controller determines which nodes in the sensor network are to receive updated information, based on the received network configuration information.
  • The network controller then determines a best route for disseminating information to each of the nodes that are to receive updated information.
  • Network routing may be handled using the Internet protocol (IP) with respect to name-space and packet framing for low and high data rate network features.
  • Network routing within the sensor network system may involve the network controller defining, and then selecting from, multiple paths between itself and a given network node.
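Selecting among the defined paths might use a max-min (widest-path) criterion, shown here as an illustrative sketch. The `link_quality` mapping and its scoring are assumptions, since the specification does not fix a path metric:

```python
def select_best_path(paths, link_quality):
    """paths is a list of node sequences from the controller to a target
    node; link_quality maps a directed hop (u, v) to a score in [0, 1].
    Pick the path whose weakest hop is strongest."""
    def worst_hop(path):
        return min(link_quality[(u, v)] for u, v in zip(path, path[1:]))
    return max(paths, key=worst_hop)
```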
  • Once the network controller determines the appropriate route, the information is disseminated to the relevant network nodes, thereby allowing the sensor network to implement the desired configuration updates.
  • The routine 500 then ends.
  • FIG. 6 provides an example of a routine 600 for exporting information from nodes in a sensor network.
  • The routine 600 is performed by an embodiment of a sensor network system.
  • The routine 600 begins at the individual device level and ends at the network controller level.
  • A non-intelligent end device in the network (e.g., a vibration sensor, an audio sensor, an RF sensor, etc.) reacts to a stimulus in its environment by transmitting a signal (e.g., via Bluetooth, 802.11, infrared, RF, etc.) to a reduced function device (e.g., a sensor with image processing capabilities, acoustic processing capabilities, etc.) in the network.
  • The reduced or full function device (which may be in the proximity of the non-intelligent end device) may wake from a "sleeping" or power-save mode in response to receipt of the transmitted signal.
  • The awakened reduced or full function device performs appropriate sensing/data collection and processing, as it is programmed to do. This may include decision making with respect to how the device collects information, and what the device, in turn, does with the collected information.
  • The reduced or full function device may collect image information, perform initial processing of that image information, and determine that additional surveillance is needed. Based on this, the reduced or full function device may awaken other devices/nodes in the network to perform additional tasks.
  • The reduced or full function device may determine that collected information should be transmitted to another network node, so that the information may be fused with other information that is being collected by nodes in the network. More specifically, smart storage using information fusion of sensor data allows the sensor network to provide only "best of best" information for later communication back to users. It may also provide for graceful loss of event information if in-network storage capacities are exceeded.
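A sketch of such "best of best" smart storage, keeping only the highest-confidence report per event up to a storage capacity. The report fields (`event_id`, `confidence`, `node`) are illustrative assumptions:

```python
def best_of_best(reports, capacity):
    """Keep the single best report per event, then the top `capacity`
    events overall, so storage overflow degrades gracefully."""
    best = {}
    for report in reports:
        current = best.get(report["event_id"])
        if current is None or report["confidence"] > current["confidence"]:
            best[report["event_id"]] = report
    ranked = sorted(best.values(), key=lambda r: r["confidence"], reverse=True)
    return ranked[:capacity]
```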
  • The reduced or full function device may determine that the collected information should be transmitted to a network controller for exportation outside the network.
  • If no routing optimization is needed, the routine skips forward to block 630. Otherwise, the routine continues at block 625, where one or more network controllers may compute and disseminate the routing optimization information (e.g., as a result of a request from one or more network nodes). For example, in connection with low data rate network features, the network controller may use hierarchical routing protocols with table-driven optimizations to determine a "best path" at any given time. Such routing optimizations may be implemented using several techniques, such as a cluster tree routing algorithm, an ad hoc on-demand distance vector (AODV) routing protocol, a landmark ad hoc routing (LANMAR) protocol, etc.
  • Select collected data intended for consumption by end users is transferred from one or more network nodes (including full function devices, reduced function devices, and/or other devices) to one or more network controllers. Routing to nodes such as the network controller may be performed using high data rate network features, and routing decisions may also be based on table-driven routing information, in which the network controller computes and disseminates routing table information to the devices with which it communicates.
  • The information can be exported under an information exportation scheme. For example, this may include real-time updates and/or involve periodic uploads over a network connection. It is also possible to use over-flight data collection mechanisms where network-type connections are not available. For example, power-efficient store-and-forward communications combined with WLAN techniques allow not only for sensor/network coordination, but also for over-flight data retrieval.
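Store-and-forward exportation can be sketched as a bounded buffer drained whenever a reach-back link (or an over-flight collector) becomes available. The class and method names are illustrative:

```python
class StoreAndForwardExporter:
    """Buffer collected records until a link is available, dropping the
    oldest records first when in-network storage capacity is exceeded."""

    def __init__(self, max_buffer: int):
        self.max_buffer = max_buffer
        self.buffer = []

    def store(self, record):
        if len(self.buffer) >= self.max_buffer:
            self.buffer.pop(0)  # graceful loss of the oldest event data
        self.buffer.append(record)

    def forward(self, link_available: bool):
        """Drain and return the buffered records when a link is up."""
        if not link_available:
            return []
        drained, self.buffer = self.buffer, []
        return drained
```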
  • The routine 600 then ends.
  • A gateway node, such as a network controller, manages the operation of the sensor network (e.g., by dynamically creating new communication links between sensor nodes) based on the needs of the mission, which can change throughout the mission based on how the mission progresses.
  • FIG. 7 is a system diagram showing an example of a mission phase-based configuration of a sensor network system in an embodiment.
  • FIG. 7 illustrates the use of the sensor network across a mission having three phases ( 702 , 704 , and 706 ).
  • These mission phases may be determined as the mission progresses, based on real-life conditions (as opposed to being known in advance). As illustrated, not all of the sensor nodes need to be active for the entire mission.
  • Sensor nodes are configured and organized in a manner such that they best serve each mission phase.
  • The system places sensor nodes that are not utilized for a given mission into a deep sleep state to conserve power resources.
  • The system awakens the appropriate sensor nodes for the particular mission phase into an active state.
  • The demarcation between sensor nodes used within different mission phases is not mutually exclusive (i.e., certain sensor nodes may be used across multiple mission phases).
  • The gateway node performs management of the sensor nodes utilized for a given mission phase.
  • The sensor network is customized based on the needs of the particular mission phase. It is possible that during a given mission phase, some of the sensor nodes may become non-operational for various reasons, such as depleted power storage capacity, adverse environmental conditions, or being disabled by external entities such as the enemy. This may result in reach-back disruption between the active sensor nodes and the gateway node.
  • In such cases, the gateway node analyzes the topology map, computes the new routing hierarchy, and commands the appropriate inactive sensor node(s) from the deep sleep state into the active state. Following this, the gateway node updates the appropriate active sensor nodes with the updated routing information (primary and alternate), thereby enabling self-healing operation of the sensor network to fulfill the objectives of the current mission phase(s).
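The self-healing step can be sketched as follows: when an active node loses all active neighbors on its path toward the gateway, the gateway wakes a sleeping neighbor to restore reach-back. This is a deliberately simplified, one-hop view of the routing-hierarchy recomputation described above:

```python
def heal_routes(active: set, sleeping: set, links: dict, gateway: str) -> set:
    """Return the set of sleeping nodes the gateway should command from the
    deep sleep state into the active state to reconnect stranded nodes."""
    woken = set()
    for node in active:
        has_uplink = any(n == gateway or n in active for n in links[node])
        if not has_uplink:
            candidate = next((n for n in links[node] if n in sleeping), None)
            if candidate is not None:
                woken.add(candidate)
    return woken
```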

Abstract

Embodiments of a sensor network system provide surveillance capabilities in multiple contexts/environments (e.g., military, commercial, scientific, civic, urban, wilderness, etc.). Network nodes may include devices such as sensors, network routers, network controllers, etc. Network sensors may be configured so that power management objectives are maximized. Network sensors (both individually and as a group) may be capable of intelligent and cooperative information gathering, so that the output of the sensor network does not contain high levels of irrelevant information. The network nodes may communicate among one another via one or more communication links, and in some cases, multiple routes between any two network nodes may be available. The sensor network may include aspects of both high data rate and low data rate network features. One or more network controllers may provide various network management capabilities, including management of network routing, information collection, information exportation, network configuration, etc.

Description

    TECHNICAL FIELD
  • The present invention relates generally to surveillance using networks, such as in a military, scientific, civic, or commercial context.
  • BACKGROUND
  • Many commercial, civic, scientific, and military operations have the need to remotely conduct surveillance of an environment. For example, military groups may have a need to conduct surveillance on a battlefield or in an urban area. Scientists may need to conduct surveillance of a forest or wetland area. Likewise, examples of surveillance activities in a commercial setting include warehouse surveillance, surveillance of large retail establishments, etc.
  • Currently, surveillance systems may use one or more deployed sensor devices that are capable of passing on collected information to users and/or user devices. For example, users may be able to go into the field and collect such information directly from field devices. More advanced surveillance systems may use some form of remote connection to automatically send collected information back to a data collection system (or the like), so that the collected information can be analyzed, stored and tracked over time, etc. However, these current systems have limitations, including those related to limited energy supply for field devices, sensor deployment and placement issues, remote information storage and retrieval issues, satellite issues, network bandwidth issues, disruption issues, obstruction issues, etc. In addition, with respect to large surveillance systems (e.g., those having many sensors), information multiplication problems may exist, which may overload human users of the information. For example, current surveillance systems may produce only a small amount of relevant information and a relatively large amount of irrelevant information, which users must then filter through.
  • SUMMARY
  • The following summary is provided for the benefit of the reader only, and is not intended to limit in any way the invention as set forth by the claims. Aspects of a sensor network system for surveillance of an environment are described herein. Embodiments of the sensor network system may be used in commercial operations, civic operations, scientific operations, military operations, etc. Once deployed (e.g., via an aerial and/or terrestrial deployment strategy), the sensor network system may operate intelligently using an autonomous framework. For example, each node in the network system may operate as an individual device with its own job and purpose. For some designated network nodes (e.g., “full function devices”), this job/purpose may require that the network node act intelligently. In such cases, the network node is equipped with some level of processing/decision-making capabilities. Examples of such capabilities include image processing capabilities, decision fusing capabilities, etc. For other network nodes, this job/purpose may require little, if any, processing capabilities. In such cases, the network node is configured only for simple and/or limited-purpose operation (e.g., configured for sensing and performing basic RF communications). In either case, communication with other nodes in the network allows each node to play an autonomous yet active role in the sensor network system. Accordingly, the sensor network system can efficiently react to an array of conditions, fuse relevant data in an intelligent way, and, to varying extents, self-organize and self-manage.
  • In an illustrative example, a group of sensors that form part of the sensor network system is deployed on a bridge to monitor traffic for enemy presence in a military context. This group of sensors includes various primary sensors that, in this case, are sensitive to vibrations, as well as secondary sensors that, in this case, are image sensors (which include some basic image processing capabilities) and acoustical sensors (which include some basic sound processing capabilities). Some of the secondary sensors in the sensor network system include information fusing capabilities. That is, these sensors have the ability to aggregate information collected by different sensors/nodes to produce more useful information.
  • To conserve energy used by the sensor network system, all the sensors in the bridge example are configured to remain in a “sleep mode” with the exception of the primary vibration sensors. If there is activity on the bridge, the vibration sensors will detect it and initiate a process that “wakes” the secondary image sensors and acoustical sensors, which in turn, gather any necessary information. Because some of the image/acoustical sensors in this example are “smart” devices, they can tell whether the traffic on the bridge may be something that human users of the network are interested in. If so, they can activate additional sensors/devices. For example, by employing time/space based local reasoning (e.g., using feature vectors tied to automated exploitation methods), sensors in the network system may be able to determine the best sensor viewpoints for event data. Using their data-fusing capabilities, select intelligent sensors fuse data together, including data received from other endpoints/sensors.
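The bridge scenario's wake chain can be sketched as follows; the threshold and sensor records are illustrative assumptions:

```python
def on_vibration(reading: float, threshold: float, secondary_sensors: list) -> list:
    """A primary vibration sensor wakes the secondary image/acoustic sensors
    only when activity on the bridge exceeds its threshold; otherwise the
    secondary sensors stay in sleep mode to conserve energy."""
    if reading < threshold:
        return []
    for sensor in secondary_sensors:
        sensor["state"] = "awake"
    return [sensor["name"] for sensor in secondary_sensors]
```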
  • In the bridge example, the sensors and network nodes then transmit aspects of the collected information to a network controller (e.g., through a set of one or more network routers). The network controller then passes the information on to the appropriate system/external network for user consumption and/or additional processing. In this context, the network controller can act as a primary host for application services that allow interchange between nodes of the sensor network and entities within one or more external networks/systems. In some embodiments, interactions between the network controllers and the one or more external networks/systems may be based on, for example, a publisher/subscriber model. This configuration reduces the amount of information that human users filter through, conserves energy expenditures at the network nodes (because nodes that are not currently needed can sleep) and allows network resources to be used in an efficient way.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a system diagram showing an example of a configuration of a sensor network system in an embodiment.
  • FIG. 2 is a system diagram showing an example of one or more network controllers forming a hierarchical network controller system in an embodiment.
  • FIG. 3 is a block diagram showing an embodiment of a sensor network system with features of both a high data rate network and a low data rate network.
  • FIG. 4 is a diagram showing examples of deploying a sensor network system in some embodiments.
  • FIG. 5 is a flow diagram showing an example of a routine for disseminating information to nodes in a sensor network in an embodiment.
  • FIG. 6 is a flow diagram showing an example of a routine for exporting information from nodes in a sensor network.
  • FIG. 7 is a system diagram showing an example of a sensor network configuration based on mission phases in an embodiment.
  • DETAILED DESCRIPTION
  • Certain specific details are set forth in the following description and in FIGS. 1-5 to provide a thorough understanding of various embodiments of the invention. Well-known structures, systems and methods often associated with network environments have not been shown or described in detail to avoid unnecessarily obscuring the description of the various embodiments of the invention. Those of ordinary skill in the relevant art will understand that additional embodiments of the present invention may be practiced without several of the details described below.
  • Many embodiments of the invention described below may take the form of computer-executable instructions, including routines executed by programmable network nodes and computers. Those skilled in the relevant art will appreciate that the invention can be practiced with other computer system and network configurations as well. Aspects of embodiments of the invention can be embodied in a special-purpose computer or data processor that is specifically programmed, configured, or constructed to perform one or more of the computer-executable instructions described below. Accordingly, the term “computer” as generally used herein refers to any data processor and includes Internet appliances, hand-held devices (including palm-top computers, wearable computers, cellular or mobile phones, multi-processor systems, processor-based or programmable consumer electronics, network computers, minicomputers and the like).
  • Aspects of embodiments of the invention can also be practiced in distributed computing environments, where tasks or modules are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules or subroutines may be located in both local and remote memory storage devices. Aspects of the invention described below may be stored or distributed on computer-readable media, including magnetic and optically readable and removable computer disks, as well as distributed electronically over networks. Data structures and transmissions of data particular to aspects of the invention are also encompassed within the scope of the invention.
  • FIG. 1 shows an example of a configuration of a sensor network system 100 in an embodiment. The sensor network system 100 may provide various capabilities including self-configuration capabilities, self-healing capabilities, and intelligent cooperative sensing. Other capabilities of the sensor network system 100 may include data storage and retrieval functionality, autonomous decision making capabilities, store and forward capabilities, and resource-aware sensing capabilities.
  • The sensor network system 100 may include at least three classes of devices, including full function devices, reduced function devices, and non-intelligent end devices. More specifically, the full functional and reduced function devices of the sensor network system 100 may include network controllers 105 (full-function devices), network routers 110 (full or reduced function devices), and network-capable end devices 115 (full or reduced function devices) including smart sensors (e.g., sensors with image processing capabilities), each having some level of network capabilities and some possibly functioning as gateways with respect to other network nodes. In some embodiments, the full function devices 105 are knowledgeable about the sensor network topology and are aware of alternate multi-path routes to reach the network controller. The non-intelligent end devices may include a variety of active and/or passive sensors 120. Examples of types of sensors may include audio/acoustic sensors, imaging sensors, video sensors, infrared sensors, RF sensors, vibration/seismic sensors, magnetic sensors, chemical sensors, etc. For example, in some embodiments, at least some of the sensors may be low energy and self-contained and provide basic sensor functionality, data dissemination and/or command/control execution. Because they may lack their own network capabilities, for such active and or passive sensors/devices 120 to function as part of the network, they may be used in conjunction with network capable end devices 115.
  • As needed, the sensors may be small (e.g., to prevent detection or remain unobtrusive) and/or come with a casing/shield that protects them against harsh environmental conditions. In some embodiments, the sensor devices may be self-powered (e.g., contain long-life batteries, operate on heat or solar energy, etc.) and consume low amounts of energy (e.g., by being energy efficient and having stand-by or inactive modes). For example, in some embodiments, image sensors may employ power-aware image compression and storage and power adaptation methods that are tailored to extended low level computation within the sensor.
  • The network connection components of the sensor network system 100 may include both high speed links 125 and low speed links (130 and 135). For example, as shown in FIG. 1, using multiple low speed star links 135, groups of one or more sensors and/or end devices may be linked to a network router 110 in a "star" configuration. In turn, the respective network router 110 (which provides both data routing and network management functionalities) may be linked to one or more other network routers 110 (e.g., using either high speed mesh links 125 or low speed mesh links 130), forming a mesh of network routers 110 that are, in turn, linked to one or more network controllers 105.
  • Various types of wireless technologies may be used to implement wireless aspects of the sensor network system. For example, aspects of some embodiments of the sensor network system may include use of wireless personal area network (WPAN) technology and/or wireless local area network (WLAN) technology. Various IEEE standards may be used to implement such wireless network technology, including standards from the IEEE 802.15 family (e.g., 802.15.1, 802.15.2, 802.15.3, 802.15.3a, 802.15.4, etc.) for WPAN and standards from the IEEE 802.11 family (e.g., 802.11a, 802.11b, 802.11g, etc.) for WLAN. In general, however, almost any type of data link mechanism may be used including satellite, Bluetooth, and/or infrared/optical techniques, cellular or digital wireless communications, wired or wireless local area network, use of existing network infrastructure, etc., and any combination of such data link mechanisms. Where possible, intermittent data link mechanisms, such as bundled custodial-based network communications, may also be used to conserve resources, including bandwidth.
  • As shown in FIG. 2, in some embodiments, one or more network controllers 105 may form a hierarchical network controller system 200 that manages overall sensor network configuration and operations, performs gateway/proxy functions, and provides access to external networks. In general, each network controller 105 in the network controller system 200 may be configured to accept, remove, and configure devices in the network (e.g., assign addresses and provide routing tables to enable efficient and effective communication of network nodes). The network controllers may also support both dynamic and periodic network synchronization, as well as support peer-to-peer communication among network nodes. In addition, the network controllers 105 may issue command and control information to end sensors (described in more detail with respect to FIG. 5) and receive data from sensors so that such data may be forwarded to external networks. Thus, the network controllers 105 may serve as the primary interface between the external networks and the end device sensors.
  • The network controllers 105 may also manage the radio configuration of the sensor network. For example, network controllers 105 may configure router nodes to perform a repeater function to extend the range of the sensor network and may perform “frequency management,” including implementing spatial reuse plans for the sensor network. To enable the above functionality, the network controllers 105 may each maintain location and operational state information for devices/nodes within the sensor network.
  • In some embodiments, the one or more network controllers may be linked to a management subsystem 205 that provides both system management to the sensor networks and information management services to external networks. System management may include planning, deployment, monitoring, and network management. Information management services may include storage, discovery, data transformation, messaging, security, and enterprise service management.
  • I. Combining High Data Rate and Low Data Rate Network Features
  • Due to the variability in the communication ranges and the amount/type of data to be interchanged, the sensor network system (which may consist of many nodes dispersed over a potentially wide geographical area) may employ a combination approach for data interchange consisting of low data rate links capable of information transfer over longer ranges and high data rate links capable of large information transfers over relatively shorter ranges. This type of approach solves problems associated with power consumption and resource conservation in a sensor network system having diverse energy consumption needs (which may be limited and/or fixed) and complex and dynamic communication needs.
  • As described above with respect to FIGS. 1 and 2, the computing and communication resources of each sensor node within the sensor network system can vary (i.e., some sensor nodes have greater computing and communication capabilities than others). While it is true that sensor network systems may function in an ad hoc manner, in some embodiments, communication to and from a particular sensor node is more akin to a client-server interaction. For example, each sensor node may interface with another network node (e.g., a network controller 105 or network router 110 of FIG. 1) that functions as a gateway, dynamically establishing a communication link between the nodes that allows for information within the network to be gathered and directed for remote processing in an environment where computing and communication resources are less constrained. This implies that the sensor network system can be envisioned as a hierarchical tree structure, such as a directed acyclic graph (DAG), with the root node of the hierarchical tree being the gateway node and sensor nodes forming various tiers of child/leaf nodes, as roughly depicted in FIG. 1.
  • In some embodiments, this hierarchical tree data model/framework results in sensor nodes closer to the gateway node performing more in-transit forwarding between their higher and lower tier level sensor nodes. To conserve computing and communication resources (thereby conserving power and extending sensor node life), it is sometimes desirable to minimize the number of hops taken by the data flow from the child/leaf sensor nodes to the gateway node. In some sensor network systems, this type of conservation is especially desirable for intermediate in-transit forwarding nodes. Accordingly, in some embodiments the sensor network is configured so that at least some of the child/leaf sensor nodes are each able to communicate directly with the gateway node via low data rate links. In contrast, data-intensive information interchanges between a given child/leaf sensor node and a gateway node may involve multiple intermediary in-transit hops using high data rate links, which have shorter ranges. In some embodiments, this combination approach facilitates implementation of a link power budget and/or frequency/spectrum reuse plan.
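  • As a rough illustration (not part of the patent disclosure), the hierarchical tree framing can be sketched in a few lines of Python: the gateway is the root, each node records its forwarding parent, a node's tier equals its hop count to the gateway, and nodes nearer the root carry the most in-transit forwarding load. All node names are hypothetical.

```python
# Minimal sketch of the hierarchical tree (DAG) described above: the gateway
# is the root (parent None) and sensor nodes form tiers of child/leaf nodes.
# Node names are illustrative only.

def tier_of(node, parent):
    """Number of hops from `node` up the tree to the gateway (the root)."""
    hops = 0
    while parent[node] is not None:
        node = parent[node]
        hops += 1
    return hops

def forwarding_load(parent):
    """Count how many descendants each node forwards for; nodes closer to
    the gateway carry more in-transit traffic, as the text notes."""
    load = {n: 0 for n in parent}
    for node in parent:
        hop = parent[node]
        while hop is not None:
            load[hop] += 1
            hop = parent[hop]
    return load

parent = {"gateway": None, "router-a": "gateway",
          "leaf-1": "router-a", "leaf-2": "router-a"}
print(tier_of("leaf-1", parent))            # 2 hops to the gateway
print(forwarding_load(parent)["router-a"])  # forwards for both leaves: 2
```

This mirrors the point above: a leaf's direct low data rate link corresponds to one logical hop, while a data-intensive transfer would traverse every tier shown by `tier_of`.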
  • The communication requirements of the sensor network system may differ based on varying levels of network capacity and power needs, as well as mission requirements. For example, many sensor network nodes are sensitive to power consumption, with less capable nodes most likely using less bandwidth and more capable nodes using more bandwidth, since bandwidth is proportional to power consumption (the communication component is typically the highest power drain of any sensor node element). In addition to power consumption, generally, more capable nodes have more data to transmit, are larger, and likely have more capacity for power storage. Less capable nodes are likely to be smaller and need less network bandwidth.
  • As shown in FIG. 3, a sensor network system 300 in accordance with some embodiments may combine features of a high data rate network 305 with features of a low data rate network 310. To conserve energy, the sensor network system 300 illustrated in FIG. 3 utilizes low data rate communications for the dissemination of, for example, command and control-type information (used in sensor and network management) and the transfer of information among sensor nodes having simple primary transducers and uses high data rate communications for sensor nodes experiencing larger information and data streaming interchanges. For each node, the determination of whether to employ either the high data rate 305 or the low data rate network features 310 may be based on a number of factors such as, capability of the node, capabilities of the surrounding nodes, criticality and latency constraints of the data, amount of data to be transferred, physical and logical state of the sensor nodes involved in the interchange, energy use requirements/limits, geographical location, frequency/spectrum reuse plans, etc. This determination may be variable (e.g., it may change from mission to mission, as new resources become available, or even transaction by transaction, as some nodes are configured to use both types of network features).
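  • A hedged sketch of the per-transaction decision described above follows; the factor names and numeric thresholds (payload cutoff, latency budget, battery fraction) are assumptions for illustration and are not specified in the patent.

```python
# Illustrative per-transaction choice between the high data rate network
# features 305 and low data rate network features 310, based on a subset of
# the factors listed above. Thresholds are assumed, not from the source.

def select_network_features(payload_bytes, latency_ms_budget,
                            battery_fraction, node_has_high_rate_radio):
    """Return 'high' or 'low' for one transaction; as the text notes, the
    determination may vary transaction by transaction for dual-mode nodes."""
    if not node_has_high_rate_radio:
        return "low"
    # Data-dense or latency-critical interchanges favor high data rate links.
    if payload_bytes > 64_000 or latency_ms_budget < 50:
        # A nearly exhausted node conserves energy on the low rate link,
        # unless a tight latency constraint forces its hand.
        if battery_fraction < 0.1 and latency_ms_budget >= 50:
            return "low"
        return "high"
    # Command/control-sized messages go over the low data rate features.
    return "low"

print(select_network_features(500_000, 1_000, 0.8, True))  # streaming: 'high'
print(select_network_features(200, 1_000, 0.8, True))      # command msg: 'low'
```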
  • For example, the high data rate network features 305 may provide high bandwidth, short-range connectivity for transferring data-dense information within the network 300 (e.g., by supporting applications that allow for on-demand imaging and video capture and transmission to computing devices performing information and decision support processing). To further illustrate, information from array sensor nodes, such as image capture sensors, benefits from the movement of larger amounts of data with stringent latency controls, favoring high data rate/bandwidth transfer. In addition, data movement for the array sensor nodes is likely bursty in nature and event driven, thus favoring high data rate network features, and involves high power requirements. An example of a high data transfer rate may be in the range of gigabits/second or higher (or high megabits/second), while an example of a low data transfer rate may be in the range of megabits/second or lower.
  • In contrast, the low data rate network features 310 may provide lower bandwidth, long range connectivity for transferring less dense information within the network (e.g., allowing information transfer from sensors to computing devices performing information and decision support processing) and may be used to monitor and control aspects of both the high data rate network features and the low data rate network features. For example, in some embodiments, the dissemination of command and control type information is ubiquitous across the network and occurs more or less continuously. Command and control type messaging typically involves small messages (which use less bandwidth). Similarly, messages from sensor nodes supporting simple primary transducers, such as vibration and acoustic signatures, tend to be small and have low bandwidth requirements. For example, a discrete sensor detects an event, wakes up from its sleep state, gathers data for a pre-determined period and prepares to send the gathered data to an upper layer fusion node. Since this is a low level sensor with minimal capability and is designed to maximize its lifetime through minimum power consumption, it is configured to send data at a minimal data rate. In general, discrete sensor data movement across the network is typically bursty in nature and the messages are likely small to medium in size, which again is facilitated by the use of low to medium bandwidth. Latency may be tightly specified, thus impacting capacity (bandwidth) requirements.
  • In some cases, particular sensor nodes (e.g., those with intermediate or high capabilities) may be configured for communication using both high data rate network features and low data rate network features. For example, a sleeping video sensor is triggered into operation via a command from the fusion node in response to the data received from a discrete sensor (via the low data rate network features). In response, the video sensor begins operation and, in turn, streams real-time video over the network (via the high data rate features). Along similar lines, more capable sensor nodes may perform data aggregation and computation functions on behalf of the less capable sensor nodes. As a result, more capable nodes can either work as an end device in high data rate mode or as an intermediary node to connect the less capable nodes to the controller. The intermediary nodes typically have both high data rate and low data rate capabilities. For this type of node, the decision on which data rate to use is made at the application level of the node, which runs on the operating system of the sensor node.
  • Generally, routing for low data rate network features 310 may be based on hierarchical routing protocols with table-driven optimizations, while routing for high data rate network features 305 may be based on table-driven routing from source to network controller. This type of configuration may permit multiple paths between a given device and network controller for both low and high data rate networks.
  • The following text describes the low data rate network features 310 and the high data rate network features 305 in the context of a protocol stack (e.g., application layer, transport layer, network layer, link layer, physical layer, etc.). With respect to the low data rate sensor network features 310, IEEE 802.15.4 may be used as a starting point for link and physical layer protocols. In some embodiments, access to communication channels may be implemented via carrier sense multiple access with collision avoidance (CSMA/CA). This allows devices accessing the communication channels to maintain a low duty cycle, while at the same time supporting a large number of devices. When operating under such circumstances, a network controller may use low data rate network features 310 to transmit reference signals to various network nodes/devices, thereby announcing its presence and making the network controller detectable to such network nodes/devices. Some embodiments may also employ a time division multiple access (TDMA) beacon structure for implementing low data rate network features, which is useful in cases where dedicated bandwidth and low latency are desirable. For example, while operating in a beacon mode, a network controller may transmit beacons at periodic intervals defining a super-frame structure and timing reference.
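  • The beacon-defined super-frame can be illustrated with a short timing sketch; the super-frame length, slot count, and slot assignments below are assumptions for illustration, not values taken from IEEE 802.15.4 or the patent.

```python
# Illustrative TDMA beacon structure: the controller's periodic beacon marks
# the start of a super-frame, and a device assigned slot i may transmit only
# inside its slot window, giving it dedicated bandwidth and bounded latency.
# All numeric parameters are assumed.

BEACON_INTERVAL_MS = 100        # assumed super-frame length
SLOTS_PER_SUPERFRAME = 10
SLOT_MS = BEACON_INTERVAL_MS / SLOTS_PER_SUPERFRAME

def slot_window(slot_index, beacon_time_ms):
    """Start/end time of a device's slot within the current super-frame."""
    start = beacon_time_ms + slot_index * SLOT_MS
    return (start, start + SLOT_MS)

def may_transmit(slot_index, beacon_time_ms, now_ms):
    """True only while the clock is inside the device's assigned window."""
    start, end = slot_window(slot_index, beacon_time_ms)
    return start <= now_ms < end

# Beacon received at t=0; the device holding slot 3 owns [30, 40) ms.
print(slot_window(3, 0))       # (30.0, 40.0)
print(may_transmit(3, 0, 35))  # True
print(may_transmit(3, 0, 45))  # False
```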
  • With respect to high data rate sensor network features 305, IEEE 802.11 may be used as a starting point for link and physical layer protocols. In some embodiments, access to communication channels may be implemented using a TDMA virtual beacon structure. Aspects of the low data rate network features 310 may be used to define the super-frame and timing reference for the high data rate network TDMA structure. The sensor network high data rate network features 305 may also employ a CSMA/CA mechanism as a backup (e.g., when connectivity via low data rate network system is disrupted).
  • Because of complexities associated with high data rate transmission (e.g., complexities relating to enhanced storage requirements, power requirements, computing requirements, and communication requirements), the high data rate network features 305 may be limited to interactions among full function devices. Scheduling of network access by such devices may be performed in coordination with a network controller, which allows for information transfer from non-intelligent sensors to reduced function and/or full function devices performing information and decision support processing. To maintain dynamic synchronization, each device may request a timing reference from the network controller via the out-of-band low data rate network features 310 prior to the scheduled communication. Accordingly, both endpoint devices (e.g., sensors) and intermediary communication devices (routers and other network nodes) may be aware of the route to reach the network controller, which manages the dissemination of the routes.
  • II. Monitoring Network Nodes Based on State (Node Profiling)
  • In some embodiments, the sensor network system may be configured as a “smart network” that provides appropriate agile connectivity and bandwidth through awareness of network nodes, including monitoring their health, states, and conditions. In such a smart network, network controllers, or the like, can be used to monitor the health and/or state of network nodes within the network over time. One of the problems this solves relates to the fact that sensor nodes within the sensor network that are not tethered have finite lives due to various conditions such as power storage capacity, adverse environmental conditions, or being disabled by external entities such as the enemy. In addition, the sensor nodes may be tampered with by external entities to signal erroneous information as a means of denial of service (DoS) attack. For these reasons and others, it is beneficial that backend sensor network management components (such as the management subsystem 205 of FIG. 2), or the like, monitor the health status of the sensor nodes to determine the effectiveness of each sensor node, i.e., whether such sensor nodes are capable of performing at or above threshold performance levels.
  • Monitoring the health and/or status of network nodes also enables the management subsystem to determine the validity of the information received from the particular node. For example, the management subsystem may perform authentication (directly and/or indirectly) to verify a node's identity and, thereby, validate the information received from the particular node. In some cases, a sensor node may be factory programmed with a unique serial number. Prior to deployment, such sensor nodes may also be programmed in the field with unique pre-placed security keys that further facilitate authentication. The management subsystem may then authenticate the sensor node based on its serial number and security keys using challenge/response mechanisms. One advantage of this type of authentication scheme includes eliminating the need to perform authentication based on Public Key Infrastructure (PKI), which ordinarily requires nodes to have more advanced computing and communication capabilities.
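  • A minimal sketch of the serial-number-plus-pre-placed-key scheme above follows. The patent does not specify the response function; HMAC-SHA256 over the serial number and a fresh challenge is an assumed (but common) symmetric construction that, as noted, avoids the computational cost of PKI.

```python
# Illustrative challenge/response authentication: the node is factory
# programmed with a serial number and field programmed with a pre-placed
# symmetric key known to the management subsystem. HMAC-SHA256 is an
# assumed response function, not specified in the source.
import hmac, hashlib, os

def node_response(serial, preplaced_key, challenge):
    """Computed on the sensor node: keyed digest over serial + challenge."""
    return hmac.new(preplaced_key, serial + challenge, hashlib.sha256).digest()

def verify_node(serial, preplaced_key, challenge, response):
    """Computed by the management subsystem, which holds the same key."""
    expected = node_response(serial, preplaced_key, challenge)
    return hmac.compare_digest(expected, response)

serial = b"SN-000123"          # factory-programmed unique serial (hypothetical)
key = os.urandom(32)           # pre-placed security key, programmed pre-deployment
challenge = os.urandom(16)     # fresh nonce per authentication attempt

resp = node_response(serial, key, challenge)
print(verify_node(serial, key, challenge, resp))            # True
print(verify_node(serial, os.urandom(32), challenge, resp)) # wrong key: False
```

The fresh nonce per attempt prevents a tampered node from simply replaying an earlier valid response.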
  • Another way that the sensor network system can facilitate authentication is through the use of alternate mechanisms, such as challenge/response and RF emission signature comparison. For example, prior research has shown that each wireless transmitter has a unique RF emission signature. Thus, in some embodiments, the RF emission signature of a given sensor node can be compared against the RF emission signature profile stored in the management subsystem to verify its identity.
  • Once the physical identity of a given sensor node has been established, its health status and performance are monitored and profiled by the management subsystem. For example, state conditions that can be monitored include RF signal strength, power consumption, power state, response time, latency, thermal condition, etc. In this way, inconsistencies in the state of a network node (e.g., the occurrence of non-linear changes in the network node's behavior) can signal action by the network. Such action may include terminating the problematic node's participation in the network (e.g., in the case of a node that is not capable of operating correctly or has otherwise been compromised); restricting the node's participation in the network; conducting further diagnostics on the node; reconfiguring the node (e.g., by facilitating a software update); generating a work order for repair of the node; deploying a new replacement node or set of replacement nodes; etc. The monitoring or profiling of network nodes may be implemented using one or more techniques, including advertising/broadcasting by nodes (ranging from dumb devices to reduced function and full function devices) and/or querying by network controllers. Similar techniques may be used for accepting newly deployed nodes into the network.
  • The sensor network system may have multiple sensor nodes collecting data about similar/related environmental parameters. This implies that data gathered from a particular sensor node will very likely be consistent with data from other sensor nodes within its proximity. In this context, nodes within the same vicinity may be those located within a specified threshold distance and/or those positioned geographically in such a way that they can (theoretically) measure the same factor in the environment and provide results within a tolerance range, where the mission plan defines the tolerance range. Accordingly, the management subsystem may analyze data received from various sensor nodes and establish the inter-relationships between the data gathered from peer sensor nodes within the same geographical region. For example, if there are temperature sensor nodes within close proximity, then the management system may assume that the temperature measurements received from each of these sensor nodes should, theoretically, be within a specified range. Measurements from sensor nodes that are beyond the expected range may then be considered suspect by the management subsystem. Once data received from a particular sensor node is deemed questionable, the management subsystem can attempt to re-authenticate the sensor and query it for its performance state information. If the management subsystem determines that the integrity of the data from a given sensor node cannot be established, it can appropriately account for this by ignoring data received from the problematic node and possibly disabling it. In addition, measurements that fall outside a specified tolerance range may be rejected.
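  • The peer cross-check in the temperature example can be sketched as follows. Using the median of nearby readings as the reference value is an illustrative choice on our part; the patent only says that measurements outside a mission-defined tolerance range are treated as suspect.

```python
# Illustrative consistency check across peer sensor nodes in the same
# vicinity: readings deviating from the peer median by more than the
# mission-defined tolerance are flagged as suspect. Node ids and data
# are hypothetical.
import statistics

def suspect_nodes(readings, tolerance):
    """readings: {node_id: measurement}. Returns sorted ids of nodes whose
    measurement deviates from the peer median by more than `tolerance`."""
    reference = statistics.median(readings.values())
    return sorted(n for n, v in readings.items()
                  if abs(v - reference) > tolerance)

nearby_temps = {"t-1": 21.4, "t-2": 21.9, "t-3": 21.6, "t-4": 35.0}
print(suspect_nodes(nearby_temps, tolerance=2.0))  # ['t-4'] is questionable
```

A flagged node such as `t-4` would then be re-authenticated and queried for its performance state, per the text above.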
  • The management subsystem may expect data received from a given sensor node within a given temporal period to be within certain bounds based on the dynamics of one or more sensed parameters. For example, multiple data samples from a vibration sensor node within a short period can be expected to follow an estimated trajectory without sudden large deviation. The management subsystem may profile the data received from the given sensor node to ensure that the node is functioning appropriately. Should the received data not meet the specifications, the management subsystem may perform re-authentication and diagnostics and, if need be, ignore data received from, and possibly disable, the particular sensor node if it does not meet the desired performance profile.
  • III. Node Deployment, Self-Configuration, Self-Organization, and Self-Healing
  • In some embodiments, the sensor network system may be configured for self-deployment, self-configuration, self-organization, and/or self-healing. This allows for the network to be initialized and successfully maintained across wide (and sometimes difficult to access) geographic areas with little or no manual intervention for multiple missions. For example, in many cases, it is simply not viable to expect manual configuration of the sensor network in the field, especially in hostile environments. After nodes are physically deployed, the sensor network incorporates various self-organization, self-configuration, and self-healing techniques that allow network nodes to be effectively configured, organized, and managed within the sensor network system on an ongoing basis, while eliminating or minimizing the need for human intervention in this regard.
  • As shown in FIG. 4, in some embodiments, deployment of nodes comprising the sensor network system may involve various terrestrial and/or aerial deployment strategies, e.g., so that wireless sensor nodes can be seeded in the field, potentially across wide geographical areas. Deployment may involve dispersal of sensor network devices by persons, robots, unmanned air vehicles (UAVs), ground platforms, etc. For example, in a military/combat environment, troops or robots may deploy network nodes on the ground using a breadcrumb approach, where devices are dispensed as needed on a path as a person or robot progresses in a surveillance network. To avoid problems with obstructions that may block network communication, sensors may be placed at locations so that every sensor/network node is in communication with at least one other network node. Aerial deployment (e.g., by UAV) (also illustrated in FIG. 4) is also a possibility in high risk areas, or areas that are difficult to reach from the ground (e.g., active battle zones or wilderness areas). However, aerial deployment may result in rougher placement of sensors.
  • A number of prior publications assume that sensor networks operate as ad hoc networks with a high degree of peer-to-peer communication. While it is true that sensor networks function in an ad hoc manner, communication to and from a particular sensor node is often more akin to a client-server interaction. For example, in some embodiments, each of the sensor nodes within the sensor network system interfaces with a gateway node (e.g., a full function device or a network controller) that allows for information to be gathered and directed for processing at a remote location (e.g., a location where computing and communication resources are not constrained), resulting in a gateway-to-sensor and sensor-to-gateway communication model. As described in preceding sections herein, this implies that the sensor network system can be envisioned as a hierarchical tree structure, such as a directed acyclic graph (DAG), with the root node of the hierarchical tree being the gateway node and sensor nodes forming various tiers of child/leaf nodes. In some embodiments, the gateway node is expected to periodically transmit beacon frames for deployed sensor nodes to synchronize with. This is not an issue for the gateway node since it does not have the power, computing, and communication resource constraints experienced by sensor nodes.
  • One challenge involved in maintaining an effective sensor network that is self-configuring, self-organizing, and self-healing relates to the automatic discovery of the sensor nodes and establishment of the DAG that effectively connects the sensor network to the gateway node, which may be driven by particular mission objectives and may, thus, change over time. In other words, as part of the seeding process, each sensor node determines where it stands relative to other nodes (e.g., within the hierarchical tree structure described above). Accordingly, while the seeding of the sensor nodes across a geographical area may be random at a micro level (i.e., not based on a specific or relative location), the distribution of sensor nodes at a macro level is organized based on the mission objectives.
  • Once the sensor nodes have been deployed, the more capable sensor nodes establish direct or indirect connectivity with the gateway node for authentication and subsequently to receive command and control information from the gateway node. For example, soon after physical deployment, existing sensor nodes are configured to detect newly deployed nodes and incorporate them into the network in an organized and meaningful way. In one illustrative example, a new set of sensor nodes is physically deployed within the network. Upon deployment, these nodes each broadcast a signal to surrounding nodes in their vicinity (assuming the sensors were deployed in the proper area and such nodes actually exist). In some embodiments, more capable sensor nodes that have already been configured periodically transmit beacon frames to enable recently deployed less capable sensor nodes to synchronize and associate with the given more capable sensor node. The more capable sensor nodes also update the gateway node with the information and state of the less capable sensor nodes that have been associated with them. The gateway node compiles this overall information of the sensor network state to compute the desired topology and routing hierarchy to be used by the sensor network system at each phase of the mission. The computed routing information, primary and alternate, for each of the more capable sensor nodes is sent to the respective sensor nodes by the gateway node, thereby enabling self-configuring operation of the sensor network.
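  • The gateway's topology computation can be sketched as a breadth-first search over the reported associations: each node is given a primary next hop toward the gateway. Alternate routes and mission phasing are omitted from this sketch, and the link data is hypothetical.

```python
# Illustrative gateway-side route computation: from the neighbor
# associations reported by the more capable nodes, derive each node's
# next hop toward the gateway (the primary route) by breadth-first search.
from collections import deque

def compute_routes(links, root="gateway"):
    """links: {node: set of neighbors}. Returns {node: next_hop_toward_root},
    with None for the root itself."""
    next_hop = {root: None}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for neighbor in sorted(links.get(node, ())):
            if neighbor not in next_hop:
                next_hop[neighbor] = node  # forward via whichever node reached it first
                queue.append(neighbor)
    return next_hop

links = {
    "gateway": {"r1", "r2"},
    "r1": {"gateway", "leaf-a"},
    "r2": {"gateway", "leaf-a"},
    "leaf-a": {"r1", "r2"},
}
routes = compute_routes(links)
print(routes["r1"])      # 'gateway'
print(routes["leaf-a"])  # 'r1' (first tier-1 router reached by the search)
```

In this sketch `leaf-a`'s unused neighbor `r2` is the natural candidate for the alternate route the text mentions.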
  • Even if there are no network controllers operating in the immediate vicinity of the newly deployed sensors (i.e., within range of receiving such broadcasted signals), by employing multi-hop techniques (the passing on of information from one node to another to reach an intended destination), an indication of the broadcasted signals eventually reaches a network controller capable of managing the self-organization and self-configuration of the network relative to these newly deployed nodes. In particular, the network controller may be programmed to send out information via lower level gateway nodes to each node that is to be affected by these newly deployed nodes. This information may specify the role/operation of the newly deployed nodes and provide rules of interaction between the new nodes and existing nodes. In addition, the network controller may be programmed to send out self-configuration information for the newly deployed nodes, so that they may each be made aware of their specific operation/role within the network. This specific operation/role may be based not only on the capabilities of a deployed node, but also on the actual location in which it is deployed. Thus, in some cases where physical deployment at a precise location is difficult to achieve (e.g., with aerial deployment), the ultimate role and/or operation of a newly deployed node cannot be verified in advance and is not determined until it has come to rest at its location and its actual location coordinates can be determined.
  • In addition to self-configuration and self-organization based on newly deployed nodes, the sensor network system may also perform self-configuration and self-organization when faced with instructions to perform a new task, activity, or mission. For example, given a new mission to monitor ground activity within an area defined by a set of coordinates, the network controller may send out new self-configuration/self-organization messages to an affected set of nodes within that area. Likewise, problems in the network (e.g., defective or malfunctioning nodes) may also be handled using similar techniques. For example, if a particular network node is no longer functioning properly and its quality of performance falls below a given level (detected, for example, using the self-monitoring techniques described above), the network controller is programmed to send out instructions to affected nodes so that they can self-reconfigure to eliminate that node from the network.
  • In general, self-configuration, self-organization, and self-healing is performed via the communication of key information within the network, sample techniques for which are described below with respect to FIGS. 5 and 6.
  • FIG. 5 provides an example of a routine 500 for disseminating information to nodes in a sensor network in a particular embodiment. For example, users of the sensor network may want to disseminate information to full and/or reduced function nodes of the network in order to configure the network in accordance with new performance requirements (e.g., as specified in a mission plan). This also facilitates the self-organizing and self-managing of the sensor network system.
  • The routine 500 of FIG. 5 is described from the perspective of a gateway node, such as a network controller node. At block 505, the network controller receives network configuration information (e.g., such as would be associated with a new mission plan) from a source, such as the management subsystem 205 of FIG. 2, or some other user-controlled source (including sources from an external network) that has access to the network controller. At block 510, the network controller determines which nodes in the sensor network are to receive updated information based on the received network configuration information. At block 515, the network controller determines a best route for disseminating information to each of the nodes that are to receive updated information. In some embodiments, network routing may be handled using Internet protocol (IP) with respect to name-space and packet framing for low and high data rate network features. To improve effectiveness, network routing within the sensor network system may involve the network controller defining and then selecting from multiple paths between itself and a given network node. At block 520, once the network controller determines the appropriate route, the information is disseminated to the relevant network nodes, thereby allowing the sensor network to implement the desired configuration updates. The routine 500 then ends.
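  • Blocks 515-520 can be sketched as follows; selecting the path with the fewest hops is an illustrative "best route" metric on our part (an actual controller might also weigh energy or link quality), and the path/node data is hypothetical.

```python
# Illustrative version of blocks 515-520 of routine 500: the controller
# keeps multiple candidate paths per destination and picks a best one
# (fewest hops here) before disseminating each configuration update.

def best_route(paths):
    """paths: list of hop lists from controller to a node; choose shortest."""
    return min(paths, key=len)

def disseminate(updates, route_table):
    """updates: {node: config}. Returns {node: chosen delivery path}."""
    return {node: best_route(route_table[node]) for node in updates}

route_table = {
    "leaf-a": [["r1", "leaf-a"], ["r1", "r2", "leaf-a"]],
    "leaf-b": [["r2", "leaf-b"]],
}
plan = disseminate({"leaf-a": {"mode": "video"}, "leaf-b": {"mode": "idle"}},
                   route_table)
print(plan["leaf-a"])  # ['r1', 'leaf-a'] -- the shorter of the two paths
```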
  • FIG. 6 provides an example of a routine 600 for exporting information from nodes in a sensor network. The routine 600 is performed by an embodiment of a sensor network system. The routine 600 begins at the individual device level and ends at the network controller level. At block 605, a non-intelligent end device in the network (e.g., a vibration sensor, an audio sensor, an RF sensor, etc.) reacts to stimulus in its environment by transmitting a signal (e.g., via Bluetooth, 802.11, infrared, RF, etc.) to a reduced function device (e.g., a sensor with image processing capabilities, acoustic processing capabilities, etc.) in the network. At block 610, the reduced or full function device (which may be in the proximity of the non-intelligent end device) may wake from a “sleeping” or power-save mode in response to receipt of the transmitted signal. At block 615, the awakened reduced or full function device performs appropriate sensing/data collection and processing, as it is programmed to do. This may include decision making with respect to how the device collects information, and what the device, in turn, does with the collected information.
  • For example, the reduced or full function device may collect image information, perform initial processing of that image information and determine that additional surveillance is needed. Based on this, the reduced or full function device may awaken other devices/nodes in the network to perform additional tasks. In another example, the reduced or full function device may determine that collected information should be transmitted to another network node, so that the information may be fused with other information that is being collected by nodes in the network. More specifically, smart storage using information fusion of sensor data allows the sensor network to provide only “best of best” information for later communication back to users. It may also provide for graceful loss of event information if in-network storage capacities are exceeded. In yet another example, the reduced or full function device may determine that the collected information should be transmitted to a network controller for exportation outside the network.
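The “best of best” storage behavior described above, keeping only the highest-value fused events and gracefully dropping the rest when in-network capacity is exceeded, resembles a bounded min-heap. This sketch assumes a numeric fusion score per event; the scoring itself is outside the scope of the example.

```python
import heapq

def fuse_and_store(store, event, score, capacity):
    """Keep only the best-scoring events; gracefully discard the
    worst when in-network storage capacity is exceeded.

    store: a list maintained as a min-heap of (score, event) pairs.
    """
    heapq.heappush(store, (score, event))
    if len(store) > capacity:
        heapq.heappop(store)  # drop the lowest-scoring event
    return store
```

When capacity is exceeded, loss is graceful by construction: the event discarded is always the one the fusion score ranks lowest, so the “best of best” remain available for later communication back to users.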
  • At decision block 620, if there is no need for the reduced or full function device to communicate with other nodes within the network, the routine skips forward to block 630. Otherwise, the routine continues at block 625, where one or more network controllers may compute and disseminate the routing optimization information (e.g., as a result of a request from one or more network nodes). For example, in connection with low data rate network features, the network controller may use hierarchical routing protocols with table-driven optimizations to determine a “best path” at any given time. Such routing optimizations may be implemented using several techniques, such as a cluster tree routing algorithm, an ad hoc on-demand distance vector (AODV) routing protocol, a landmark ad hoc routing (LANMAR) protocol, etc.
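As a stand-in for the cluster-tree, AODV, or LANMAR schemes named above, a minimal table-driven “best path” computation (fewest hops over the current link table) might look like the following. The link-table format is an assumption for illustration; the real protocols add on-demand discovery, sequence numbers, and hierarchy.

```python
from collections import deque

def best_path(links, src, dst):
    """Compute a fewest-hops path over a link table via breadth-first
    search -- a simplified stand-in for the hierarchical, table-driven
    routing optimizations described above."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            # Reconstruct the path by walking parents back to src.
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nbr in links.get(node, ()):
            if nbr not in parent:
                parent[nbr] = node
                queue.append(nbr)
    return None  # no route currently exists
```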
  • At block 630, select collected data intended for consumption by end users is transferred from one or more network nodes (including high function devices, reduced function devices, and/or other devices) to one or more network controllers. Routing to nodes such as the network controller may be performed using high data rate network features, and routing decisions may also be based on table-driven routing information, in which the network controller computes and disseminates routing table information to devices with which it communicates. Once at the network controllers, the information can be exported under an information exportation scheme. For example, this may include real-time updates and/or involve periodic uploads over a network connection. It is also possible to use over-flight data collection mechanisms where network-type connections are not available. For example, power efficient store-and-forward communications combined with WLAN techniques allow not only for sensor/network coordination, but also for over-flight data retrieval. The routine 600 then ends.
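The power-efficient store-and-forward behavior, buffering readings until a network connection or over-flight collector becomes available, can be sketched as below; the function and its arguments are illustrative assumptions.

```python
def store_and_forward(buffer, reading, link_up):
    """Queue a reading while no network connection exists; flush the
    whole buffer when a connection (or over-flight collector) is
    available. Returns the readings delivered on this call."""
    buffer.append(reading)
    delivered = []
    if link_up:
        # Flush: hand off everything buffered, then clear in place.
        delivered, buffer[:] = list(buffer), []
    return delivered
```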
  • In some embodiments, a gateway node such as a network controller manages the operation of the sensor network (e.g., by dynamically creating new communication links between sensor nodes) based on the needs of the mission, which can change throughout the mission based on how the mission progresses. FIG. 7 is a system diagram showing an example of a mission phase-based configuration of a sensor network system in an embodiment. In particular, FIG. 7 illustrates the use of the sensor network across a mission having three phases (702, 704, and 706). In some cases, these mission phases may be determined as the mission progresses, based on real-life conditions (as opposed to being known in advance). As illustrated, not all the sensor nodes need to be active for the entire mission. Thus, sensor nodes are configured and organized in a manner such that they best serve each mission phase. In some embodiments, the system places sensor nodes that are not utilized for a given mission phase into a deep sleep state to conserve power resources. As subsequent mission phases begin, the system awakens the appropriate sensor nodes for the particular mission phase into an active state. The demarcation between sensor nodes used within different mission phases is not mutually exclusive (i.e., certain sensor nodes may be used across multiple mission phases).
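The phase-based power management described above, activating only the nodes a phase needs and deep-sleeping the rest while allowing nodes to appear in multiple phases, reduces to a small mapping. Names and state labels below are illustrative, not terms from the patent.

```python
def configure_phase(all_nodes, phase_nodes):
    """Return each node's power state for a mission phase: nodes the
    phase needs become active; all others go into deep sleep to
    conserve power resources."""
    return {n: ("active" if n in phase_nodes else "deep_sleep")
            for n in all_nodes}
```

Calling this once per phase as the mission progresses reconfigures the network on the fly; a node listed in two phases simply stays active across both, reflecting the non-mutually-exclusive demarcation noted above.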
  • In some embodiments, the gateway node performs management of the sensor nodes utilized for a given mission phase. Thus, at any given point during the mission, the sensor network is customized based on the needs of the particular mission phase. It is possible that during a given mission phase, some of the sensor nodes may become non-operational for various reasons, such as power storage capacity, adverse environmental conditions, or being disabled by external entities such as the enemy. This may result in reach-back disruption between the active sensor nodes and the gateway node. Under such circumstances the gateway node analyzes the topology map, computes the new routing hierarchy, and commands the appropriate inactive sensor node(s) from deep sleep state into active state. Following this, the gateway node updates the appropriate active sensor nodes with the updated routing information (primary and alternate), thereby enabling self-healing operation of the sensor network to fulfill the objectives of the current mission phase(s).
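The self-healing step, falling back from a primary route to an alternate that avoids failed nodes, can be sketched as follows. The route-table layout (an ordered list of candidate paths per node, primary first) is an assumed simplification of the gateway's topology map.

```python
def self_heal(routes, failed):
    """For each node, keep its primary route if intact, else fall back
    to the first alternate that avoids all failed nodes.

    routes: dict mapping node id -> ordered list of candidate paths,
            primary first (illustrative, not the patented algorithm).
    failed: set of node ids currently non-operational.
    """
    healed = {}
    for node, candidates in routes.items():
        healed[node] = next(
            (path for path in candidates if failed.isdisjoint(path)),
            None,  # no viable route: reach-back remains disrupted
        )
    return healed
```

In the full scheme, the gateway would additionally wake sleeping nodes to create new alternates before this selection runs.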
  • From the foregoing, it will be appreciated that specific embodiments of the invention have been described herein for purposes of illustration, but that various modifications may be made without deviating from the spirit and scope of the invention. Aspects of the invention described in the context of particular embodiments may be combined or eliminated in other embodiments. For example, while certain embodiments describe the use of sensor networks operating in a military environment, the invention may be implemented in the context of other environments where a need for surveillance is established.
  • Although advantages associated with certain embodiments of the invention have been described in the context of those embodiments, other embodiments may also exhibit such advantages. Additionally, none of the foregoing embodiments need necessarily exhibit such advantages to fall within the scope of the invention. Accordingly, the invention is not limited except as by the appended claims.

Claims (77)

1. A surveillance network comprising multiple network nodes:
a first set of network nodes configured to communicate within the surveillance network using a first type of communications link;
a second set of network nodes configured to communicate within the surveillance network using both the first type of communications link and a second type of communications link; and
at least one controller that is directly or indirectly linked to each of the multiple network nodes, the controller being capable of establishing whether each of the multiple network nodes is a member of the first set or the second set.
2. The surveillance network of claim 1 wherein the second set of network nodes includes reduced function devices.
3. The surveillance network of claim 1 wherein the first set of network nodes includes endpoint devices, including one or more sensors.
4. The surveillance network of claim 1 wherein the first set of network nodes includes full function devices.
5. The surveillance network of claim 1 wherein at least some of the multiple nodes communicate via a gateway node that functions as a communication hub, and wherein the gateway node is the controller.
6. The surveillance network of claim 1 wherein at least some of the multiple nodes communicate via a gateway node that functions as a communication hub, and wherein the gateway node is a network router.
7. The surveillance network of claim 1 wherein the first type of communications link is associated with a bandwidth allowing low data rate information transfer over a specified range of distances.
8. The surveillance network of claim 1 wherein the second type of communications link is associated with a bandwidth allowing high data rate information transfer over a specified range of distances.
9. A system for conducting surveillance in a physical environment, the system comprising:
multiple network nodes including endpoint devices and one or more network gateways, the endpoint devices having sensors for detecting environmental conditions, and the one or more network gateways functioning as communication hubs for one or more of the multiple network nodes;
means for employing, with respect to a first group of network nodes from the multiple network nodes, a first communication technique associated with a first type of information; and
means for employing, with respect to a second group of network nodes from the multiple network nodes, a second communication technique associated with the first type information and a second type of information.
10. The system of claim 9
wherein the second communication technique is configured for high bandwidth, short-range connectivity for transferring data-dense information, and is implemented using supporting applications that allow for on-demand imaging and video capture and transmission to computing devices performing information and decision support processing,
wherein the first communication technique includes information transfer from sensors to computing devices performing information and decision support processing, is configured for low bandwidth, long range connectivity for transferring low-density information, and may be used to monitor and control aspects of the communication of nodes in both the first group and the second group.
11. The system of claim 9 wherein the first communication technique is configured for high bandwidth, short-range connectivity for transferring data-dense information.
12. The system of claim 9 wherein the first communication technique is configured for low bandwidth, long range connectivity for transferring low-density information and facilitates monitoring and controlling aspects of the communication of nodes in both the first group and the second group.
13. The system of claim 9 wherein the second communication technique is associated with standards from the IEEE 802.11 family and wherein the first communication technique is associated with standards from the IEEE 802.15 family.
14. The system of claim 9 wherein the first communication technique facilitates information transfer from sensors to computing devices performing information and decision support processing.
15. A method for communication in an information gathering network which gathers information from a physical environment, the method comprising:
designating a first set of nodes in the information gathering network;
designating a second set of nodes in the information gathering network;
assigning a first communication link type to the first set of nodes to allow nodes from the first set of nodes to communicate with other nodes in the information gathering network; and
assigning the first communication link type and a second communication link type to the second set of nodes to allow nodes from the second set to communicate with other nodes in the information gathering network.
16. The method of claim 15 wherein assigning the first communication link type to the first set of nodes and assigning the first communication link type and the second communication link type to the second set of nodes is based on characteristics of information to be communicated within the information gathering network.
17. The method of claim 15 wherein the assigning is further based on power usage requirements of the network node.
18. The method of claim 15 wherein the assigning is further based on power usage requirements of a network component sending information to the network node.
19. The method of claim 15 wherein the assigning is further based on a distance to a network component which is to receive information from the network node via the selected communication link type.
20. The method of claim 15 wherein the first communication link type is associated with low data rate, low bandwidth data transmission that employs link and physical layer protocols in accordance with the IEEE 802.15.4 standard.
21. The method of claim 15 wherein the first communication link type is associated with a low data rate low bandwidth communication link type that includes means for transferring reference signals to one or more network nodes in a carrier sense multiple access with collision avoidance (CSMA/CA) mode.
22. The method of claim 15 wherein the first communication link type is associated with low data rate low bandwidth data transmission that includes means for employing a time division multiple access (TDMA) beacon structure.
23. The method of claim 15 wherein the second communication link type is associated with high data rate high bandwidth data transmission that employs link and physical layer protocols in accordance with the IEEE 802.11 family of standards.
24. The method of claim 15 wherein the first communication link type is associated with low data rate low bandwidth data transmission that employs means to define a super-frame and timing reference for transferring information using the second communication link type.
25. The method of claim 15 wherein the second communication link type is associated with high data rate high bandwidth data transmission, and wherein the second communication link type is configured for transferring information among full functioning devices within the information gathering network.
26. A surveillance network comprising:
a controller;
a plurality of sensors being capable of communicating with the controller, each of the plurality of sensors being capable of collecting information from its environment and sending the collected information to the controller;
the controller being responsive to the communication from each of the plurality of sensors to authenticate a given sensor and compare its collected information to the collected information from a set of sensors located in the vicinity of the given sensor.
27. The surveillance network of claim 26 wherein the controller further accepts the collected information if the collected information is within a given tolerance range of the information collected from the set of sensors located in the vicinity of the given sensor and rejects the collected information if the collected information is substantially outside of a tolerance range of the information collected from the set of sensors located in the vicinity of the sensor.
28. The surveillance network of claim 26 wherein the controller further disconnects the communication link to the given sensor if the collected information is substantially outside of a tolerance range of the information collected from the set of sensors located in the vicinity of the sensor.
29. The surveillance network of claim 26 wherein the controller further disconnects the communication link to the given sensor if the collected information is substantially outside of a tolerance range of the information collected from the set of sensors located in the vicinity of the sensor.
30. The surveillance network of claim 26 wherein the controller further disconnects the communication link to the given sensor if the collected information is substantially outside of a tolerance range of the information collected from the set of sensors located in the vicinity of the sensor.
31. The surveillance network of claim 26 wherein the set of sensors is from the plurality of sensors.
32. The surveillance network of claim 26, wherein the controller authenticates the given sensor based on its RF signature.
33. The surveillance network of claim 26 wherein the controller authenticates the given sensor based on its security key.
34. The surveillance network of claim 33 wherein the security key is field programmed.
35. The surveillance network of claim 26 wherein the controller authenticates each sensor based on its RF signature and security key.
36. The surveillance network of claim 33 wherein the security key is field programmed.
37. A surveillance method comprising:
collecting information from one or more network nodes within an environment, the collected information relating to a given factor or set of factors;
comparing the collected information to information received from at least two other network nodes in the same environment, wherein the at least two other network nodes are responsible for collecting information that relates to the given factor or set of factors;
accepting the collected information if a variance range between the collected information and the information received from the at least two other network nodes satisfies a specified threshold; and
rejecting the collected information if the variance range between the collected information and the information received from the at least two other network nodes does not satisfy a specified threshold.
38. The method of claim 37 wherein the one or more network nodes include multiple sensors operating in at least partial cooperation.
39. The method of claim 37 wherein the one or more network nodes include at least one primary sensor, wherein the at least one primary sensor is configured to sense a designated stimulus and send an activation signal to at least one secondary sensor in its vicinity based on sensing the designated stimulus.
40. A sensor network system comprising:
multiple network nodes configured to perform surveillance activities within a physical environment; and
at least one network controller configured to monitor individual states of the multiple network nodes over time and further configured to detect an inconsistency in a behavior of at least one network node, determine whether the detected inconsistency requires an action, generate instructions relating to the action if the detected inconsistency requires an action, and deploy the generated instructions to a subset of the multiple network nodes that includes one or more network nodes affected by the inconsistency.
41. The system of claim 40
wherein detecting an inconsistency in a behavior of the at least one network node includes receiving a broadcasted message from the at least one network node concerning the state of the at least one network node, wherein the monitored individual states include RF signal strength, power consumption, power state, response time, latency, and/or thermal condition;
wherein the inconsistency is detected by a non-linear change in behavior or state and results from the at least one network node malfunctioning; and
wherein the action includes conducting further diagnostics of the at least one network node and either terminating network participation of the at least one network node or reconfiguring the at least one network node.
42. The system of claim 40 wherein the inconsistency is detected by observing a non-linear change in behavior.
43. The system of claim 40 wherein the action includes terminating network participation of the at least one network node.
44. The system of claim 40 wherein the inconsistency results from the at least one network node malfunctioning.
45. The system of claim 40 wherein the action includes conducting further diagnostics of the at least one network node.
46. The system of claim 40 wherein the action includes reconfiguring the at least one network node.
47. The system of claim 40 wherein the action includes facilitating an automatic software update for the at least one autonomous network node.
48. The system of claim 40 wherein the action includes generating a work order for repair of the at least one autonomous network node.
49. The system of claim 40 wherein the action includes deploying a new replacement node or set of replacement nodes.
50. The system of claim 40 wherein automatically detecting an inconsistency in a behavior of the at least one network node includes receiving a broadcasted message from the at least one network node concerning the state of the at least one network node.
51. The system of claim 40 wherein the at least one network node is a newly deployed network node that requires self-configuration instructions.
52. The system of claim 40 wherein the monitored individual states include at least one of RF signal strength, power consumption, power state, response time, latency, and/or thermal condition.
53. A reconfigurable surveillance network comprising:
a controller for receiving a mission plan;
a plurality of sensors being capable of communicating with the controller; and
the controller being responsive to the mission plan to create a network of sensors by generating a communication link with a first set of the plurality of sensors based on requirements of the mission.
54. The reconfigurable surveillance network of claim 53 wherein the controller receives an updated mission plan and accordingly disconnects a select number of existing communication links and generates communication links with a second set of the plurality of sensors.
55. The reconfigurable surveillance network of claim 54 wherein the select number is in the range of one to the maximum number of sensors with existing communication links.
56. The reconfigurable surveillance network of claim 53 wherein the controller receives an updated mission plan and accordingly generates communication links with a second set of the multiple sensors.
56. The reconfigurable surveillance network of claim 53 wherein the controller receives an updated mission plan and accordingly generates communication links with a second set of the plurality of sensors.
57. The reconfigurable surveillance network of claim 53 wherein the controller disconnects the communication link of a sensor when a quality of a performance of the sensor falls below a given level and generates a communication link with one of the plurality of sensors to replace the sensor with the disconnected communication link.
58. The reconfigurable surveillance network of claim 53 wherein the generated communication link is based on a gateway to sensor and sensor to gateway communication model.
59. The reconfigurable surveillance network of claim 53 wherein the plurality of sensors includes a first sensor and a second sensor, and wherein information collected by the first sensor and the second sensor is integrated into a single information set based on instructions provided by the controller.
60. The reconfigurable surveillance network of claim 53 wherein the plurality of sensors includes at least one sensor that lacks processing capabilities and at least one sensor configured for image or acoustical processing.
61. The reconfigurable surveillance network of claim 53 wherein the controller is configured for deployment in a military environment.
62. The reconfigurable surveillance network of claim 53 wherein the controller provides routing information for at least some of the plurality of sensors, and wherein the routing information determines how information is routed among the plurality of sensors in the surveillance network.
receiving, at a network node, an indication of a new mission, activity, or task to be performed by the surveillance network;
identifying one or more network nodes being capable of performing the new mission, activity, or task; and
enabling each of the identified one or more network nodes to perform the mission.
64. The method of claim 63 wherein the enabling includes generating and disseminating an operation specification for each of the identified one or more network nodes, wherein the disseminated information enables automatic re-configuration and/or re-organization of the identified one or more network nodes so that the new mission, activity, or task can be performed.
65. The method of claim 63 wherein the generated operation specification for each of the identified one or more network nodes includes software reconfiguration codes.
66. The method of claim 63 wherein enabling each of the identified one or more network nodes to perform the mission includes designating communication links between select network nodes within the surveillance network.
67. A method of self-configuration of a surveillance network having multiple network nodes, the method comprising:
receiving an indication of a recently deployed network node;
determining a role or operation specification for the recently deployed network node based, at least in part, on the relative location of the recently deployed network node and the capabilities of the recently deployed network node; and
providing information for dissemination within the surveillance network, the provided information including an indication of a role or operation specification for the recently deployed network node and enabling automatic integration of the recently deployed network node into the surveillance network.
68. The method of claim 67
wherein the received indication is a broadcast signal sent from the recently deployed network node, and wherein the received indication includes an indication of the actual location of the recently deployed network node and an indication of the capabilities of the recently deployed network node;
wherein determining the role or operation specification is based, at least in part, on an application of network operation rules locally accessible to the network controller; and
wherein the provided information includes configuration instructions for the recently deployed network node and communication instructions for sensor nodes that are to be in at least intermittent communication with the recently deployed network node during operation of the surveillance network.
69. The method of claim 67 wherein the provided information includes configuration instructions for the recently deployed network node.
70. The method of claim 67 wherein the received indication includes an indication of the actual location of the recently deployed network node.
71. The method of claim 67 wherein the received indication includes an indication of the capabilities of the recently deployed network node.
72. The method of claim 67 wherein determining the role or operation specification is based, at least in part, on an application of network operation rules locally accessible to the network controller.
73. The method of claim 67 wherein the received indication is a broadcast signal sent from the recently deployed network node.
74. The method of claim 67 wherein the received indication is a message received from a network node other than the recently deployed network node, the message being passed based on a multi-hop framework.
75. The method of claim 67 wherein the recently deployed node is configured to broadcast a signal conveying its presence to network nodes in its vicinity, and wherein one or more of the network nodes in its vicinity is configured to generate the received indication based on the broadcast signal.
76. The method of claim 67 wherein the role or operation specification for the recently deployed node is not known until after the recently deployed node is deployed.
77. The method of claim 67 wherein the role or operation specification for the recently deployed node is at least partially known by the network controller before the recently deployed node is deployed.
US11/317,634 2005-12-22 2005-12-22 Surveillance network system Abandoned US20070150565A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
US11/317,634 US20070150565A1 (en) 2005-12-22 2005-12-22 Surveillance network system
AT08018658T ATE501585T1 (en) 2005-12-22 2006-11-07 SENSOR MONITORING NETWORK SYSTEM
PCT/US2006/043573 WO2007078422A2 (en) 2005-12-22 2006-11-07 Surveillance network system
DE602006020637T DE602006020637D1 (en) 2005-12-22 2006-11-07 Sensor monitoring network system
EP08018609.1A EP2026536B1 (en) 2005-12-22 2006-11-07 Sensor surveillance network system
EP06837203A EP1969818B1 (en) 2005-12-22 2006-11-07 Surveillance network system
EP08018658A EP2019534B1 (en) 2005-12-22 2006-11-07 Sensor surveillance network system
US13/223,508 US10542093B2 (en) 2005-12-22 2011-09-01 Surveillance network system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/317,634 US20070150565A1 (en) 2005-12-22 2005-12-22 Surveillance network system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/223,508 Division US10542093B2 (en) 2005-12-22 2011-09-01 Surveillance network system

Publications (1)

Publication Number Publication Date
US20070150565A1 true US20070150565A1 (en) 2007-06-28

Family

ID=37964794

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/317,634 Abandoned US20070150565A1 (en) 2005-12-22 2005-12-22 Surveillance network system
US13/223,508 Active 2026-02-19 US10542093B2 (en) 2005-12-22 2011-09-01 Surveillance network system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/223,508 Active 2026-02-19 US10542093B2 (en) 2005-12-22 2011-09-01 Surveillance network system

Country Status (5)

Country Link
US (2) US20070150565A1 (en)
EP (3) EP2026536B1 (en)
AT (1) ATE501585T1 (en)
DE (1) DE602006020637D1 (en)
WO (1) WO2007078422A2 (en)

Cited By (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070149139A1 (en) * 2004-06-10 2007-06-28 Jean-Louis Gauvreau Wireless Network System with Energy Management
US20070182535A1 (en) * 2006-02-09 2007-08-09 Alps Automotive, Inc. Wireless sourceless sensor
US20070239862A1 (en) * 2006-04-07 2007-10-11 The Mitre Corporation Smart data dissemination
US20080034069A1 (en) * 2005-09-29 2008-02-07 Bruce Schofield Workflow Locked Loops to Enable Adaptive Networks
US20080085722A1 (en) * 2006-10-10 2008-04-10 Radioframe Networks, Inc. Sensing RF environment to detect change in geographic location of cellular base station
US20080085720A1 (en) * 2006-10-10 2008-04-10 Radioframe Networks, Inc. Sensing RF environment to manage mobile network resources
US20080231343A1 (en) * 2007-03-19 2008-09-25 Commtest Instruments, Ltd. Method and System for Vibration Sensing Power Management
US20090063432A1 (en) * 2007-08-28 2009-03-05 Charu Chandra Aggarwal System and Method for Historical Diagnosis of Sensor Networks
US20090150699A1 (en) * 2007-11-29 2009-06-11 Electronics And Telecommunications Research Institute Sleep scheduling method based on moving directions of target in sensor network
US20090213785A1 (en) * 2005-01-25 2009-08-27 Fraunhofer-Gesellschaft Zur Förderung Der Angewand System and Method for the Monitoring of Grouped Objects
US20090288424A1 (en) * 2008-05-23 2009-11-26 Leblond Raymond G Enclosure for surveillance hardware
US20090289788A1 (en) * 2008-05-23 2009-11-26 Leblond Raymond G Peer to peer surveillance architecture
WO2009158637A1 (en) * 2008-06-27 2009-12-30 Qualcomm Incorporated Methods and apparatus for communicating and/or using discovery information
US20100074266A1 (en) * 2007-02-04 2010-03-25 Ki-Hyung Kim Ip-usn with multiple and communication method
US20100125437A1 (en) * 2008-11-17 2010-05-20 Jean-Philippe Vasseur Distributed sample survey technique for data flow reduction in sensor networks
US20100139290A1 (en) * 2008-05-23 2010-06-10 Leblond Raymond G Enclosure for surveillance hardware
US20100169021A1 (en) * 2008-12-31 2010-07-01 Nokia Corporation Earthquake detection apparatus, system, and method
WO2010122555A1 (en) * 2009-04-20 2010-10-28 Ioimage Ltd. Box-to-box camera configuration/reconfiguration
US20100309844A1 (en) * 2009-06-03 2010-12-09 Nec Laboratories America, Inc. Integrated sensor networks with optical and wireless links
US20110013018A1 (en) * 2008-05-23 2011-01-20 Leblond Raymond G Automated camera response in a surveillance architecture
US20110055381A1 (en) * 2009-09-03 2011-03-03 Mcafee, Inc. Host information collection
WO2011033466A1 (en) * 2009-09-17 2011-03-24 Defendec Inc A monitoring method, a monitoring system and a sensor station
US20110141967A1 (en) * 2009-12-14 2011-06-16 Lane Sean L Methods and apparatus related to substantially real-time data transmission and analysis for sensors
US20110251659A1 (en) * 2010-04-09 2011-10-13 Marvin Prescott Method of Managing Metabolic Syndrome
DE102010030082A1 (en) * 2010-06-15 2011-12-15 Endress + Hauser Process Solutions Ag Method for operating wireless fieldbus system in process system, involves performing acyclic data exchange in fieldbus system through gateway when super ordinate process control network is initiated
US20120004782A1 (en) * 2010-06-30 2012-01-05 Motorola, Inc. Methods for managing power consumption in a sensor network
EP2417827A1 (en) * 2009-04-07 2012-02-15 Telefonaktiebolaget LM Ericsson (publ) Attaching a sensor to a wsan
US20120059903A1 (en) * 2010-09-06 2012-03-08 Samsung Electronics Co. Ltd. Method and apparatus for processing sensory information in wireless sensor network
US20120134282A1 (en) * 2010-11-30 2012-05-31 Nokia Corporation Method and apparatus for selecting devices to form a community
KR101176307B1 (en) 2010-12-08 2012-08-23 강릉원주대학교산학협력단 Apparatus of multiplexing data transmission path for wireless sensor network
US20120311042A1 (en) * 2011-06-03 2012-12-06 Fujitsu Limited Distribution method and distribution system
US20130067017A1 (en) * 2010-04-15 2013-03-14 Mxi Technologies, Ltd. Method and system for deployed operations support
US8547876B1 (en) * 2011-04-13 2013-10-01 The United States Of America As Represented By Secretary Of The Navy Intermediate functional device and method
US20130336316A1 (en) * 2012-06-15 2013-12-19 Cisco Technology, Inc. Reliable on-demand distributed data management in a sensor-actuator fabric
US8620342B2 (en) 2006-10-10 2013-12-31 Broadcom Corporation Sensing RF environment to determine geographic location of cellular base station
US20140044016A1 (en) * 2006-10-05 2014-02-13 Cisco Technology, Inc. Upgrading mesh access points in a wireless mesh network
US20140052113A1 (en) * 2012-08-17 2014-02-20 Carl Zeiss Meditec Ag Instrument system and procedure for phacoemulsification
WO2014043761A1 (en) * 2012-09-21 2014-03-27 University Of South Australia Multi-access communication system
US20140092763A1 (en) * 2011-04-08 2014-04-03 Nicola De Carne Method for managing a wireless sensor network, and corresponding sensor node, sensor network, and computer program product
US20140129866A1 (en) * 2012-11-07 2014-05-08 Microsoft Corporation Aggregation framework using low-power alert sensor
GB2512285A (en) * 2013-03-22 2014-10-01 Cambridge Comm Systems Ltd Node partitioning
US20140330905A1 (en) * 2013-05-02 2014-11-06 Electronics And Telecommunications Research Institute Apparatus and method for setting up active networking of smart devices for providing converged service
US20140376384A1 (en) * 2013-06-22 2014-12-25 Tata Consultancy Services Limited System and method for adapting characteristics of application layer protocol using sensed indication
US20150067850A1 (en) * 2013-08-30 2015-03-05 Bank Of America Corporation Ddos detection using sensor grid
FR3014633A1 (en) * 2013-12-11 2015-06-12 Commissariat Energie Atomique DEPLOYMENT OF AD HOC NETWORKS
US20150180716A1 (en) * 2013-07-30 2015-06-25 Google Inc. Mobile computing device and wearable computing device having automatic access mode control
US20150192922A1 (en) * 2012-09-14 2015-07-09 Zte Corporation Industrial control system and management device
US20150271050A1 (en) * 2012-10-15 2015-09-24 Zte Corporation Method and device for determining topology of network
US20150312535A1 (en) * 2014-04-23 2015-10-29 International Business Machines Corporation Self-rousing surveillance system, method and computer program product
CN105025087A (en) * 2015-06-15 2015-11-04 山东大学 Beidou wide-area migration target autonomous internet of things and stereoscopic monitoring integrated device and working method thereof
US9183110B2 (en) * 2012-11-26 2015-11-10 Google Inc. Centralized dispatching of application analytics
WO2015187852A1 (en) * 2014-06-04 2015-12-10 International Mobile Iot Corp. Location-based network system and location-based communication method
US20160044281A1 (en) * 2014-08-07 2016-02-11 Smart Digital LLC Portable Surveillance Device
US9329579B2 (en) * 2010-11-05 2016-05-03 Scanimetrics Inc. Wireless sensor device
US20160142493A1 (en) * 2013-07-31 2016-05-19 Hitachi Solutions, Ltd. Sensor data collection system
WO2016081154A1 (en) * 2014-11-20 2016-05-26 Qualcomm Incorporated Collaborative data capturing apparatuses and methods
WO2016123239A1 (en) * 2015-01-27 2016-08-04 Dragonfly Technology Inc. Systems and methods for providing wireless sensor networks with an asymmetric network architecture
US20160269275A1 (en) * 2013-10-16 2016-09-15 The Regents Of The University Of California A Method for Distance-Vector Routing Using Adaptive Publish-Subscribe Mechanisms
US20170032645A1 (en) * 2015-07-29 2017-02-02 Dell Products, Lp Provisioning and Managing Autonomous Sensors
US20170055205A1 (en) * 2015-08-18 2017-02-23 Covidien Lp Radio network communication modes in physiological status monitoring
US9706489B2 (en) 2015-01-27 2017-07-11 Locix Inc. Systems and methods for providing wireless asymmetric network architectures of wireless devices with anti-collision features
US20170208127A1 (en) * 2014-07-25 2017-07-20 Hewlett Packard Enterprise Development Lp Software-defined sensing
CN107306206A (en) * 2016-04-22 2017-10-31 费希尔-罗斯蒙特系统公司 Wireless mesh network is analyzed and configuration
US20180047646A1 (en) * 2016-02-24 2018-02-15 Kla-Tencor Corporation Accuracy improvements in optical metrology
US10028220B2 (en) 2015-01-27 2018-07-17 Locix, Inc. Systems and methods for providing wireless asymmetric network architectures of wireless devices with power management features
US10057123B1 (en) * 2013-12-27 2018-08-21 Alarm.Com Incorporated Network topology backup
US20180274336A1 (en) * 2017-03-21 2018-09-27 Welltec A/S Downhole completion system
CN109218342A (en) * 2017-06-29 2019-01-15 上海金艺检测技术有限公司 Equipment operating data acquisition processing system based on intelligence sensor
CN109511091A (en) * 2018-11-26 2019-03-22 广州鲁邦通物联网科技有限公司 A kind of BLE MESH network routing algorithm based on location information
WO2019058293A1 (en) * 2017-09-22 2019-03-28 Uab Metapro Holding Equipment for determining human health status using synchronized data from multiple sensors and method of operation thereof
US10355920B2 (en) * 2016-07-13 2019-07-16 Computational Systems, Inc. Defining acquisition and measurement definitions in a machine monitoring system
US10419540B2 (en) 2015-10-05 2019-09-17 Microsoft Technology Licensing, Llc Architecture for internet of things
US10455368B2 (en) 2015-10-28 2019-10-22 Locix, Inc. Systems and methods for providing communications within wireless sensor networks based on at least one periodic guaranteed time slot for sensor nodes
US10542093B2 (en) 2005-12-22 2020-01-21 The Boeing Company Surveillance network system
US10887817B2 (en) 2014-06-04 2021-01-05 International Mobile Iot Corp. Location-based network system and location-based communication method
US10921167B1 (en) * 2015-09-25 2021-02-16 EMC IP Holding Company LLC Methods and apparatus for validating event scenarios using reference readings from sensors associated with predefined event scenarios
US10938890B2 (en) 2018-03-26 2021-03-02 Toshiba Global Commerce Solutions Holdings Corporation Systems and methods for managing the processing of information acquired by sensors within an environment
CN112752234A (en) * 2020-12-30 2021-05-04 深圳中创艾宝技术有限公司 Communication link transmission method and system based on narrowband wireless ad hoc network
US20210297490A1 (en) * 2020-03-20 2021-09-23 Airbus Operations Gmbh Monitoring system network and method for operating a monitoring system network
DE102020204109A1 (en) 2020-03-30 2021-09-30 Airbus Operations Gmbh MONITORING SYSTEM NETWORK AND METHOD OF OPERATING A MONITORING SYSTEM NETWORK
WO2021232150A1 (en) * 2020-05-19 2021-11-25 National Research Council Of Canada A multi-channel and agnostic hardware-software interface and database architecture for predictive and prescriptive materials discovery
CN113784304A (en) * 2021-09-13 2021-12-10 国网信息通信产业集团有限公司 Communication system
CN113867150A (en) * 2021-10-14 2021-12-31 北京工业大学 Event-driven control method of multi-agent with saturated input
US20220131867A1 (en) * 2020-10-23 2022-04-28 Yokogawa Electric Corporation Device, method, and storage medium
US20220215074A1 (en) * 2019-05-07 2022-07-07 The Nielsen Company (Us), Llc End-point media watermarking
US20220303338A1 (en) * 2021-03-22 2022-09-22 Yokogawa Electric Corporation Commissioning distributed control nodes
GB2616612A (en) * 2022-03-14 2023-09-20 Smarter Tech Group Limited Data transmission
US11856483B2 (en) 2016-07-10 2023-12-26 ZaiNar, Inc. Method and system for radiolocation asset tracking via a mesh network
US11958183B2 (en) 2020-09-18 2024-04-16 The Research Foundation For The State University Of New York Negotiation-based human-robot collaboration via augmented reality

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9418040B2 (en) * 2005-07-07 2016-08-16 Sciencelogic, Inc. Dynamically deployable self configuring distributed network management system
KR101370290B1 (en) * 2007-07-31 2014-03-05 삼성전자주식회사 Method and apparatus for generating multimedia data with decoding level, and method and apparatus for reconstructing multimedia data with decoding level
DE102007044794A1 (en) * 2007-09-19 2009-04-02 Siemens Ag Method and device for operating an infrastructure network
CA2839247C (en) 2011-06-14 2017-04-04 Abb Research Ltd. Dynamic assigning of bandwidth to field devices in a process control system
ITMI20111993A1 (en) * 2011-11-03 2013-05-04 Milano Politecnico DISTRIBUTED DATA WIRELESS ACQUISITION SYSTEM
US9188668B2 (en) 2012-11-27 2015-11-17 At&T Intellectual Property I, L.P. Electromagnetic reflection profiles
CN103118103A (en) * 2013-01-29 2013-05-22 浪潮电子信息产业股份有限公司 Cloud server framework capable of achieving multi-node interconnection and management
US10075228B2 (en) 2013-04-22 2018-09-11 Latitude Technologies Corporation Aircraft flight data monitoring and reporting system and use thereof
CN104065754A (en) * 2014-07-14 2014-09-24 昆明联诚科技股份有限公司 Wireless sensor network based on P2P technology and construction method thereof
DK3320457T3 (en) 2015-07-10 2021-06-14 Whether or Knot LLC SYSTEM AND METHOD OF ELECTRONIC DATA DISTRIBUTION
CN105391786A (en) * 2015-11-25 2016-03-09 北京华油信通科技有限公司 Logistics transportation vehicle-mounted intelligent sensing network structure
CA3018601C (en) 2016-03-24 2023-10-03 CyPhy Works, Inc. Persistent aerial reconnaissance and communication system
CN108632940B (en) * 2017-03-23 2021-10-08 中国科学院沈阳自动化研究所 Reliable multipath routing algorithm suitable for photoelectric sensor wireless MESH network
EP3451708A1 (en) * 2017-09-01 2019-03-06 BlackBerry Limited Method and system for load balancing of sensors
US10762727B2 (en) * 2017-12-29 2020-09-01 Loon Llc Estimation of aerial vehicle state
US10917293B2 (en) 2018-03-25 2021-02-09 Cisco Technology, Inc. Controller for bulk onboarding
US10666498B2 (en) * 2018-04-26 2020-05-26 Rosemount Aerospace Inc. Architecture for wireless avionics communication networks
US11417223B2 (en) 2020-01-19 2022-08-16 Flir Unmanned Aerial Systems Ulc Flight altitude estimation systems and methods
US11423790B2 (en) 2020-01-19 2022-08-23 Flir Unmanned Aerial Systems Ulc Tether management systems and methods
CN111061299B (en) * 2020-01-21 2020-12-29 南京智能信通科技发展有限公司 Ground sensor data acquisition method based on flight trajectory of unmanned aerial vehicle

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5905715A (en) * 1994-09-01 1999-05-18 British Telecommunications Public Limited Company Network management system for communications networks
US5929748A (en) * 1997-06-12 1999-07-27 Microsoft Corporation Automated home control using existing electrical lines as a communications medium
US6646676B1 (en) * 2000-05-17 2003-11-11 Mitsubishi Electric Research Laboratories, Inc. Networked surveillance and control system
US20040217881A1 (en) * 2003-04-29 2004-11-04 Innovative Technology Licensing, Llc Modular wireless integrated network sensor (WINS) node with a dual bus architecture
US20050218218A1 (en) * 2004-03-31 2005-10-06 Karl Koster Systems and methods for an electronic programmable merchandise tag
US20070103305A1 (en) * 2005-11-08 2007-05-10 Bratkovski Alexandre M Multi-tiered network for gathering detected condition information

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6901530B2 (en) * 2000-08-01 2005-05-31 Qwest Communications International, Inc. Proactive repair process in the xDSL network (with a VDSL focus)
US20030026268A1 (en) * 2000-11-28 2003-02-06 Siemens Technology-To-Business Center, Llc Characteristic routing
EP1386432A4 (en) * 2001-03-21 2009-07-15 John A Stine An access and routing protocol for ad hoc networks using synchronous collision resolution and node state dissemination
US7043660B1 (en) * 2001-10-08 2006-05-09 Agilent Technologies, Inc. System and method for providing distributed fault management policies in a network management system
US7283904B2 (en) * 2001-10-17 2007-10-16 Airbiquity, Inc. Multi-sensor fusion
US20030151513A1 (en) * 2002-01-10 2003-08-14 Falk Herrmann Self-organizing hierarchical wireless network for surveillance and control
US20040098395A1 (en) * 2002-11-18 2004-05-20 Omron Corporation Self-organizing sensor network and method for providing self-organizing sensor network with knowledge data
GB0321041D0 (en) 2003-09-09 2004-02-04 Qinetiq Ltd Sensor apparatus and system
US7119676B1 (en) * 2003-10-09 2006-10-10 Innovative Wireless Technologies, Inc. Method and apparatus for multi-waveform wireless sensor network
WO2005062066A2 (en) * 2003-10-22 2005-07-07 Awarepoint Corporation Wireless position location and tracking system
EP1545069A1 (en) * 2003-12-19 2005-06-22 Sony International (Europe) GmbH Remote polling and control system
AU2003295301A1 (en) * 2003-12-23 2005-07-14 Telefonaktiebolaget Lm Ericsson (Publ) Method and system for efficient routing in ad hoc networks
TWI399049B (en) * 2004-01-13 2013-06-11 Interdigital Tech Corp Orthogonal frequency division multiplexing (ofdm) method and apparatus for protecting and authenticating wirelessly transmitted digital information
JP4303603B2 (en) * 2004-01-13 2009-07-29 株式会社日立製作所 Nuclear medicine diagnostic device and its detector unit
US7388601B2 (en) * 2004-02-18 2008-06-17 Inter-cité Vidéo Inc. System and method for the automated, remote diagnostic of the operation of a digital video recording network
US7303528B2 (en) * 2004-05-18 2007-12-04 Scimed Life Systems, Inc. Serialization of single use endoscopes
US7561057B2 (en) * 2004-05-27 2009-07-14 Lawrence Kates Method and apparatus for detecting severity of water leaks
US7171334B2 (en) * 2004-06-01 2007-01-30 Brion Technologies, Inc. Method and apparatus for synchronizing data acquisition of a monitored IC fabrication process
US7466149B1 (en) * 2004-06-07 2008-12-16 Corr Instruments, Llc. Electronic system and software for multielectrode sensors and electrochemical devices
US7089099B2 (en) * 2004-07-30 2006-08-08 Automotive Technologies International, Inc. Sensor assemblies
US7299042B2 (en) * 2004-07-30 2007-11-20 Pulse-Link, Inc. Common signaling method and apparatus
US20070198675A1 (en) * 2004-10-25 2007-08-23 International Business Machines Corporation Method, system and program product for deploying and allocating an autonomic sensor network ecosystem
US7378962B2 (en) * 2004-12-30 2008-05-27 Sap Aktiengesellschaft Sensor node management and method for monitoring a seal condition of an enclosure
US7605351B2 (en) * 2005-05-06 2009-10-20 Illinois Tool Works Inc. Redundant control circuit for hot melt adhesive hose assembly heater circuits and temperature sensors
US20070150565A1 (en) 2005-12-22 2007-06-28 Arun Ayyagari Surveillance network system

Cited By (157)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070149139A1 (en) * 2004-06-10 2007-06-28 Jean-Louis Gauvreau Wireless Network System with Energy Management
US8223730B2 (en) * 2005-01-25 2012-07-17 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. System and method for the monitoring of grouped objects
US20090213785A1 (en) * 2005-01-25 2009-08-27 Fraunhofer-Gesellschaft Zur Förderung Der Angewandten Forschung E.V. System and Method for the Monitoring of Grouped Objects
US20080034069A1 (en) * 2005-09-29 2008-02-07 Bruce Schofield Workflow Locked Loops to Enable Adaptive Networks
US9129253B2 (en) * 2005-09-29 2015-09-08 Rpx Clearinghouse Llc Workflow locked loops to enable adaptive networks to change a policy statement responsive to mission level exceptions and reconfigure the software-controllable network responsive to network level exceptions
US10542093B2 (en) 2005-12-22 2020-01-21 The Boeing Company Surveillance network system
US20070182535A1 (en) * 2006-02-09 2007-08-09 Alps Automotive, Inc. Wireless sourceless sensor
US20070239862A1 (en) * 2006-04-07 2007-10-11 The Mitre Corporation Smart data dissemination
US8892704B2 (en) * 2006-04-07 2014-11-18 The Mitre Corporation Dynamic rule-based distributed network operation for wireless sensor networks
US20140044016A1 (en) * 2006-10-05 2014-02-13 Cisco Technology, Inc. Upgrading mesh access points in a wireless mesh network
US9980155B2 (en) * 2006-10-05 2018-05-22 Cisco Technology, Inc. Upgrading mesh access points in a wireless mesh network
US20080085720A1 (en) * 2006-10-10 2008-04-10 Radioframe Networks, Inc. Sensing RF environment to manage mobile network resources
US8620342B2 (en) 2006-10-10 2013-12-31 Broadcom Corporation Sensing RF environment to determine geographic location of cellular base station
US8280366B2 (en) * 2006-10-10 2012-10-02 Broadcom Corporation Sensing RF environment to detect change in geographic location of cellular base station
US9332463B2 (en) 2006-10-10 2016-05-03 Broadcom Corporation Sensing RF environment to determine geographic location of cellular base station
US8744466B2 (en) 2006-10-10 2014-06-03 Broadcom Corporation Sensing RF environment to manage mobile network resources
US20080085722A1 (en) * 2006-10-10 2008-04-10 Radioframe Networks, Inc. Sensing RF environment to detect change in geographic location of cellular base station
US20100074266A1 (en) * 2007-02-04 2010-03-25 Ki-Hyung Kim Ip-usn with multiple and communication method
US8238355B2 (en) * 2007-02-04 2012-08-07 Ajou University Industry-Academic Cooperation Foundation IP-USN with multiple and communication method
US20080231343A1 (en) * 2007-03-19 2008-09-25 Commtest Instruments, Ltd. Method and System for Vibration Sensing Power Management
US7676458B2 (en) * 2007-08-28 2010-03-09 International Business Machines Corporation System and method for historical diagnosis of sensor networks
US20090063432A1 (en) * 2007-08-28 2009-03-05 Charu Chandra Aggarwal System and Method for Historical Diagnosis of Sensor Networks
US20090150699A1 (en) * 2007-11-29 2009-06-11 Electronics And Telecommunications Research Institute Sleep scheduling method based on moving directions of target in sensor network
US20090288424A1 (en) * 2008-05-23 2009-11-26 Leblond Raymond G Enclosure for surveillance hardware
US11282380B2 (en) 2008-05-23 2022-03-22 Leverage Information Systems, Inc. Automated camera response in a surveillance architecture
US9918046B2 (en) * 2008-05-23 2018-03-13 Leverage Information Systems, Inc. Peer to peer surveillance architecture
US20160249017A1 (en) * 2008-05-23 2016-08-25 Leverage Information Systems Peer to peer surveillance architecture
US20170104962A1 (en) * 2008-05-23 2017-04-13 Leverage Information Systems, Inc. Peer to peer surveillance architecture
US9661276B2 (en) * 2008-05-23 2017-05-23 Leverage Information Systems, Inc. Peer to peer surveillance architecture
US9786164B2 (en) 2008-05-23 2017-10-10 Leverage Information Systems, Inc. Automated camera response in a surveillance architecture
US9035768B2 (en) * 2008-05-23 2015-05-19 Leverage Information Systems Peer to peer surveillance architecture
US20090289788A1 (en) * 2008-05-23 2009-11-26 Leblond Raymond G Peer to peer surveillance architecture
US20110013018A1 (en) * 2008-05-23 2011-01-20 Leblond Raymond G Automated camera response in a surveillance architecture
US20100139290A1 (en) * 2008-05-23 2010-06-10 Leblond Raymond G Enclosure for surveillance hardware
US20090325601A1 (en) * 2008-06-27 2009-12-31 Qualcomm Incorporated Methods and apparatus for communicating and/or using discovery information
WO2009158637A1 (en) * 2008-06-27 2009-12-30 Qualcomm Incorporated Methods and apparatus for communicating and/or using discovery information
US8332541B2 (en) 2008-06-27 2012-12-11 Qualcomm Incorporated Methods and apparatus for communicating and/or using discovery information
JP2011526471A (en) * 2008-06-27 2011-10-06 クゥアルコム・インコーポレイテッド Method and apparatus for communicating and / or using discovery information
US20100125437A1 (en) * 2008-11-17 2010-05-20 Jean-Philippe Vasseur Distributed sample survey technique for data flow reduction in sensor networks
US8452572B2 (en) * 2008-11-17 2013-05-28 Cisco Technology, Inc. Distributed sample survey technique for data flow reduction in sensor networks
US20100169021A1 (en) * 2008-12-31 2010-07-01 Nokia Corporation Earthquake detection apparatus, system, and method
CN102365901A (en) * 2009-04-07 2012-02-29 瑞典爱立信有限公司 Attaching a sensor to a WSAN
EP2417827A1 (en) * 2009-04-07 2012-02-15 Telefonaktiebolaget LM Ericsson (publ) Attaching a sensor to a wsan
EP2417827A4 (en) * 2009-04-07 2014-03-05 Ericsson Telefon Ab L M Attaching a sensor to a wsan
US9398267B2 (en) * 2009-04-20 2016-07-19 Flir Commercial Systems, Inc. Box-to-box camera configuration/reconfiguration
US20120229648A1 (en) * 2009-04-20 2012-09-13 Aharon Kass Box-to-box camera configuration/reconfiguration
WO2010122555A1 (en) * 2009-04-20 2010-10-28 Ioimage Ltd. Box-to-box camera configuration/reconfiguration
US20100309844A1 (en) * 2009-06-03 2010-12-09 Nec Laboratories America, Inc. Integrated sensor networks with optical and wireless links
US8588135B2 (en) * 2009-06-03 2013-11-19 Nec Laboratories America, Inc. Integrated sensor networks with optical and wireless links
US9391858B2 (en) * 2009-09-03 2016-07-12 Mcafee, Inc. Host information collection
US20110055381A1 (en) * 2009-09-03 2011-03-03 Mcafee, Inc. Host information collection
WO2011033466A1 (en) * 2009-09-17 2011-03-24 Defendec Inc A monitoring method, a monitoring system and a sensor station
US8624729B2 (en) 2009-09-17 2014-01-07 Defendec Inc. Monitoring method, a monitoring system and a sensor station
US20110141967A1 (en) * 2009-12-14 2011-06-16 Lane Sean L Methods and apparatus related to substantially real-time data transmission and analysis for sensors
US20110251659A1 (en) * 2010-04-09 2011-10-13 Marvin Prescott Method of Managing Metabolic Syndrome
US20130067017A1 (en) * 2010-04-15 2013-03-14 Mxi Technologies, Ltd. Method and system for deployed operations support
US9898703B2 (en) * 2010-04-15 2018-02-20 Mxi Technologies, Ltd. Method and system for deployed operations support
DE102010030082A1 (en) * 2010-06-15 2011-12-15 Endress + Hauser Process Solutions Ag Method for operating wireless fieldbus system in process system, involves performing acyclic data exchange in fieldbus system through gateway when super ordinate process control network is initiated
US8509923B2 (en) * 2010-06-30 2013-08-13 Motorola Solutions, Inc. Methods for managing power consumption in a sensor network
US20120004782A1 (en) * 2010-06-30 2012-01-05 Motorola, Inc. Methods for managing power consumption in a sensor network
US20120059903A1 (en) * 2010-09-06 2012-03-08 Samsung Electronics Co. Ltd. Method and apparatus for processing sensory information in wireless sensor network
US9329579B2 (en) * 2010-11-05 2016-05-03 Scanimetrics Inc. Wireless sensor device
US20120134282A1 (en) * 2010-11-30 2012-05-31 Nokia Corporation Method and apparatus for selecting devices to form a community
KR101176307B1 (en) 2010-12-08 2012-08-23 강릉원주대학교산학협력단 Apparatus of multiplexing data transmission path for wireless sensor network
US20140092763A1 (en) * 2011-04-08 2014-04-03 Nicola De Carne Method for managing a wireless sensor network, and corresponding sensor node, sensor network, and computer program product
US8547876B1 (en) * 2011-04-13 2013-10-01 The United States Of America As Represented By Secretary Of The Navy Intermediate functional device and method
US8656036B2 (en) * 2011-06-03 2014-02-18 Fujitsu Limited Distribution method and distribution system
US20120311042A1 (en) * 2011-06-03 2012-12-06 Fujitsu Limited Distribution method and distribution system
US9059929B2 (en) * 2012-06-15 2015-06-16 Cisco Technology, Inc. Reliable on-demand distributed data management in a sensor-actuator fabric
US20130336316A1 (en) * 2012-06-15 2013-12-19 Cisco Technology, Inc. Reliable on-demand distributed data management in a sensor-actuator fabric
US20140052113A1 (en) * 2012-08-17 2014-02-20 Carl Zeiss Meditec Ag Instrument system and procedure for phacoemulsification
US20150192922A1 (en) * 2012-09-14 2015-07-09 Zte Corporation Industrial control system and management device
WO2014043761A1 (en) * 2012-09-21 2014-03-27 University Of South Australia Multi-access communication system
US10594425B2 (en) 2012-09-21 2020-03-17 Myriota Pty Ltd Multi-access communication system
US9668210B2 (en) 2012-09-21 2017-05-30 University Of South Australia Multi-access communication system
US11438084B2 (en) 2012-09-21 2022-09-06 Myriota Pty Ltd Multi-access communication system
AU2017201816B2 (en) * 2012-09-21 2017-06-08 Myriota Pty Ltd Multi-access communication system
US20150271050A1 (en) * 2012-10-15 2015-09-24 Zte Corporation Method and device for determining topology of network
US9503356B2 (en) * 2012-10-15 2016-11-22 Zte Corporation Method and device for determining topology of network
CN104782103A (en) * 2012-11-07 2015-07-15 微软公司 Aggregation framework using low-power alert sensor
US9325792B2 (en) * 2012-11-07 2016-04-26 Microsoft Technology Licensing, Llc Aggregation framework using low-power alert sensor
US20140129866A1 (en) * 2012-11-07 2014-05-08 Microsoft Corporation Aggregation framework using low-power alert sensor
US9183110B2 (en) * 2012-11-26 2015-11-10 Google Inc. Centralized dispatching of application analytics
US9606895B2 (en) 2012-11-26 2017-03-28 Google Inc. Centralized dispatching of application analytics
US10331539B2 (en) 2012-11-26 2019-06-25 Google Llc Centralized dispatching of application analytics
GB2512285B (en) * 2013-03-22 2015-09-30 Cambridge Comm Systems Ltd Node partitioning
GB2512285A (en) * 2013-03-22 2014-10-01 Cambridge Comm Systems Ltd Node partitioning
US10187926B2 (en) * 2013-05-02 2019-01-22 Electronics And Telecommunications Research Institute Apparatus and method for setting up active networking of smart devices for providing converged service
US20140330905A1 (en) * 2013-05-02 2014-11-06 Electronics And Telecommunications Research Institute Apparatus and method for setting up active networking of smart devices for providing converged service
US9479948B2 (en) * 2013-06-22 2016-10-25 Tata Consultancy Services Limited System and method for adapting characteristics of application layer protocol using sensed indication
US20140376384A1 (en) * 2013-06-22 2014-12-25 Tata Consultancy Services Limited System and method for adapting characteristics of application layer protocol using sensed indication
US10194271B2 (en) 2013-07-30 2019-01-29 Google Llc Mobile computing device and wearable computing device having automatic access mode control
US9647887B2 (en) * 2013-07-30 2017-05-09 Google Inc. Mobile computing device and wearable computing device having automatic access mode control
US10721589B2 (en) 2013-07-30 2020-07-21 Google Llc Mobile computing device and wearable computing device having automatic access mode control
US20150180716A1 (en) * 2013-07-30 2015-06-25 Google Inc. Mobile computing device and wearable computing device having automatic access mode control
US20160142493A1 (en) * 2013-07-31 2016-05-19 Hitachi Solutions, Ltd. Sensor data collection system
US20150067850A1 (en) * 2013-08-30 2015-03-05 Bank Of America Corporation Ddos detection using sensor grid
US9060021B2 (en) * 2013-08-30 2015-06-16 Bank Of America Corporation DDoS detection using sensor grid
US20160269275A1 (en) * 2013-10-16 2016-09-15 The Regents Of The University Of California A Method for Distance-Vector Routing Using Adaptive Publish-Subscribe Mechanisms
US20190132235A1 (en) * 2013-10-16 2019-05-02 The Regents Of The University Of California Method for distance-vector routing using adaptive publish-subscribe mechanisms
US10091094B2 (en) * 2013-10-16 2018-10-02 The Regents Of The University Of California Method for distance-vector routing using adaptive publish-subscribe mechanisms
WO2015086331A1 (en) * 2013-12-11 2015-06-18 Commissariat A L'energie Atomique Et Aux Energies Alternatives Deployment of ad-hoc networks
FR3014633A1 (en) * 2013-12-11 2015-06-12 Commissariat Energie Atomique DEPLOYMENT OF AD HOC NETWORKS
US11038756B1 (en) 2013-12-27 2021-06-15 Alarm.Com Incorporated Network topology backup
US10530651B1 (en) 2013-12-27 2020-01-07 Alarm.Com Incorporated Network topology backup
US11695633B2 (en) 2013-12-27 2023-07-04 Alarm.Com Incorporated Network topology backup
US10057123B1 (en) * 2013-12-27 2018-08-21 Alarm.Com Incorporated Network topology backup
US20150312535A1 (en) * 2014-04-23 2015-10-29 International Business Machines Corporation Self-rousing surveillance system, method and computer program product
WO2015187852A1 (en) * 2014-06-04 2015-12-10 International Mobile Iot Corp. Location-based network system and location-based communication method
US10887817B2 (en) 2014-06-04 2021-01-05 International Mobile Iot Corp. Location-based network system and location-based communication method
US20220027204A1 (en) * 2014-07-25 2022-01-27 Hewlett Packard Enterprise Development Lp Software-defined sensing
US11159618B2 (en) * 2014-07-25 2021-10-26 Hewlett Packard Enterprise Development Lp Software-defined sensing
US20170208127A1 (en) * 2014-07-25 2017-07-20 Hewlett Packard Enterprise Development Lp Software-defined sensing
US11943300B2 (en) * 2014-07-25 2024-03-26 Hewlett Packard Enterprise Development Lp Software-defined sensing
US20160044281A1 (en) * 2014-08-07 2016-02-11 Smart Digital LLC Portable Surveillance Device
US9877292B2 (en) 2014-11-20 2018-01-23 Qualcomm Incorporated Collaborative data capturing apparatuses and methods
WO2016081154A1 (en) * 2014-11-20 2016-05-26 Qualcomm Incorporated Collaborative data capturing apparatuses and methods
WO2016123239A1 (en) * 2015-01-27 2016-08-04 Dragonfly Technology Inc. Systems and methods for providing wireless sensor networks with an asymmetric network architecture
US10028220B2 (en) 2015-01-27 2018-07-17 Locix, Inc. Systems and methods for providing wireless asymmetric network architectures of wireless devices with power management features
US11924757B2 (en) 2015-01-27 2024-03-05 ZaiNar, Inc. Systems and methods for providing wireless asymmetric network architectures of wireless devices with power management features
US9706489B2 (en) 2015-01-27 2017-07-11 Locix Inc. Systems and methods for providing wireless asymmetric network architectures of wireless devices with anti-collision features
CN105025087A (en) * 2015-06-15 2015-11-04 山东大学 Beidou wide-area migration target autonomous internet of things and stereoscopic monitoring integrated device and working method thereof
US9652963B2 (en) * 2015-07-29 2017-05-16 Dell Products, Lp Provisioning and managing autonomous sensors
US20170032645A1 (en) * 2015-07-29 2017-02-02 Dell Products, Lp Provisioning and Managing Autonomous Sensors
US20170238123A1 (en) * 2015-07-29 2017-08-17 Dell Products, Lp Provisioning and Managing Autonomous Sensors
US20170055205A1 (en) * 2015-08-18 2017-02-23 Covidien Lp Radio network communication modes in physiological status monitoring
US10652808B2 (en) * 2015-08-18 2020-05-12 Covidien LP Radio network communication modes in physiological status monitoring
US10921167B1 (en) * 2015-09-25 2021-02-16 EMC IP Holding Company LLC Methods and apparatus for validating event scenarios using reference readings from sensors associated with predefined event scenarios
US10419540B2 (en) 2015-10-05 2019-09-17 Microsoft Technology Licensing, Llc Architecture for internet of things
US10455368B2 (en) 2015-10-28 2019-10-22 Locix, Inc. Systems and methods for providing communications within wireless sensor networks based on at least one periodic guaranteed time slot for sensor nodes
US20180047646A1 (en) * 2016-02-24 2018-02-15 Kla-Tencor Corporation Accuracy improvements in optical metrology
US10212080B2 (en) 2016-04-22 2019-02-19 Fisher-Rosemount Systems, Inc. Wireless mesh network analysis and configuration
GB2549852B (en) * 2016-04-22 2021-09-29 Fisher Rosemount Systems Inc Wireless mesh network analysis and configuration
GB2549852A (en) * 2016-04-22 2017-11-01 Fisher Rosemount Systems Inc Wireless mesh network analysis and configuration
CN107306206A (en) * 2016-04-22 2017-10-31 费希尔-罗斯蒙特系统公司 Wireless mesh network is analyzed and configuration
US11856483B2 (en) 2016-07-10 2023-12-26 ZaiNar, Inc. Method and system for radiolocation asset tracking via a mesh network
US10355920B2 (en) * 2016-07-13 2019-07-16 Computational Systems, Inc. Defining acquisition and measurement definitions in a machine monitoring system
RU2754903C2 (en) * 2017-03-21 2021-09-08 Веллтек Ойлфилд Солюшнс АГ Well completion system
US10774619B2 (en) * 2017-03-21 2020-09-15 Welltec Oilfield Solutions Ag Downhole completion system
US20180274336A1 (en) * 2017-03-21 2018-09-27 Welltec A/S Downhole completion system
CN109218342A (en) * 2017-06-29 2019-01-15 上海金艺检测技术有限公司 Equipment operating data acquisition processing system based on intelligence sensor
WO2019058293A1 (en) * 2017-09-22 2019-03-28 Uab Metapro Holding Equipment for determining human health status using synchronized data from multiple sensors and method of operation thereof
US10938890B2 (en) 2018-03-26 2021-03-02 Toshiba Global Commerce Solutions Holdings Corporation Systems and methods for managing the processing of information acquired by sensors within an environment
CN109511091A (en) * 2018-11-26 2019-03-22 广州鲁邦通物联网科技有限公司 A kind of BLE MESH network routing algorithm based on location information
US20220215074A1 (en) * 2019-05-07 2022-07-07 The Nielsen Company (Us), Llc End-point media watermarking
US20210297490A1 (en) * 2020-03-20 2021-09-23 Airbus Operations Gmbh Monitoring system network and method for operating a monitoring system network
DE102020204111A1 (en) 2020-03-30 2021-09-30 Airbus Operations Gmbh MONITORING SYSTEM NETWORK AND METHOD OF OPERATING A MONITORING SYSTEM NETWORK
DE102020204109A1 (en) 2020-03-30 2021-09-30 Airbus Operations Gmbh MONITORING SYSTEM NETWORK AND METHOD OF OPERATING A MONITORING SYSTEM NETWORK
WO2021232150A1 (en) * 2020-05-19 2021-11-25 National Research Council Of Canada A multi-channel and agnostic hardware-software interface and database architecture for predictive and prescriptive materials discovery
US11958183B2 (en) 2020-09-18 2024-04-16 The Research Foundation For The State University Of New York Negotiation-based human-robot collaboration via augmented reality
US20220131867A1 (en) * 2020-10-23 2022-04-28 Yokogawa Electric Corporation Device, method, and storage medium
CN112752234A (en) * 2020-12-30 2021-05-04 深圳中创艾宝技术有限公司 Communication link transmission method and system based on narrowband wireless ad hoc network
US20220303338A1 (en) * 2021-03-22 2022-09-22 Yokogawa Electric Corporation Commissioning distributed control nodes
US11750696B2 (en) * 2021-03-22 2023-09-05 Yokogawa Electric Corporation Commissioning distributed control nodes
CN113784304A (en) * 2021-09-13 2021-12-10 国网信息通信产业集团有限公司 Communication system
CN113867150A (en) * 2021-10-14 2021-12-31 北京工业大学 Event-driven control method of multi-agent with saturated input
GB2616612A (en) * 2022-03-14 2023-09-20 Smarter Tech Group Limited Data transmission

Also Published As

Publication number Publication date
EP2019534B1 (en) 2011-03-09
EP1969818B1 (en) 2013-04-03
EP2026536A1 (en) 2009-02-18
EP2019534A1 (en) 2009-01-28
EP2026536B1 (en) 2013-04-24
EP1969818A2 (en) 2008-09-17
DE602006020637D1 (en) 2011-04-21
WO2007078422A2 (en) 2007-07-12
ATE501585T1 (en) 2011-03-15
US10542093B2 (en) 2020-01-21
WO2007078422A3 (en) 2007-09-27
US20120084839A1 (en) 2012-04-05

Similar Documents

Publication Publication Date Title
US10542093B2 (en) Surveillance network system
Carlos-Mancilla et al. Wireless sensor networks formation: approaches and techniques
Rahman A survey on sensor network
Akkaya et al. A survey on routing protocols for wireless sensor networks
Jain et al. Current trends in wireless sensor network design
Halder et al. LiMCA: an optimal clustering algorithm for lifetime maximization of internet of things
Younis et al. On efficient clustering of wireless sensor networks
Renold et al. Survey on state scheduling-based topology control in unattended wireless sensor networks
Chang Wireless sensor networks and applications
Sachan et al. A survey of energy-efficient communication protocols in WSNs
Fahmy Cross-Layer Protocols for WSNs
Hong et al. Cost-efficient routing protocol (CERP) on wireless sensor networks
Divakar et al. Energy Optimization in Wireless Sensor Network using Clustering and PSO Algorithm
Chaubey et al. An Efficient Cluster Based Energy Routing Protocol (E-CBERP) for Wireless Body Area Networks Using Soft Computing Technique
Beydoun et al. Energy-efficient WSN infrastructure
Al-Turjman Cognitive-node architecture and a deployment strategy for the future sensor networks 1
Abbi et al. Analysis and Clustering of Sensor Recorded Data to Determine Sensors Consuming the Least Energy
Mustafa Multiple Criteria Decision-Making Based Clustering Technique for WSNs
Sadouq et al. Hybrid Techniques to Conserve Energy in WSN
Dankan Gowda et al. Convergence of Communication Technologies with Internet of Things
Sikka et al. An Overview of Wireless Sensors Networks
Diratie Hybrid internet of things network for energy-efficient video surveillance system
Merga Energy Efficiency in Data Dissemination Protocols of Wireless Sensor Networks
Gupta et al. Literature Survey on Data Aggregation Techniques Using Mobile Agent and Trust-Aware in Wireless Sensor Network
Sekhar A distance based sleep schedule algorithm for enhanced lifetime of heterogeneous wireless sensor networks

Legal Events

Date Code Title Description
AS Assignment

Owner name: BOEING COMPANY, THE, ILLINOIS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AYYAGARI, ARUN;UNG, KEVIN Y.;BLAIR, RICK;AND OTHERS;REEL/FRAME:017679/0126;SIGNING DATES FROM 20051122 TO 20060125

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION