US20090192921A1 - Methods and apparatus to survey a retail environment - Google Patents
- Publication number
- US20090192921A1 (application US12/019,280)
- Authority
- US
- United States
- Prior art keywords
- image
- cart
- images
- retail establishment
- location information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/087—Inventory or stock management, e.g. order filling, procurement or balancing against orders
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0312—Detection arrangements using opto-electronic means for tracking the rotation of a spherical or circular member, e.g. optical rotary encoders used in mice or trackballs using a tracking ball or in mouse scroll wheels
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0317—Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
Definitions
- the present disclosure relates generally to consumer monitoring and, more particularly, to methods and apparatus to survey a retail environment.
- Retail establishments and product manufacturers are often interested in the shopping activities, behaviors, and/or habits of people in a retail environment.
- Consumer activity related to shopping can be used to correlate product sales with particular shopping behavior and/or to improve placements of products, advertisements, and/or other product-related information in a retail environment.
- Known techniques for monitoring consumer activities in retail establishments include conducting surveys, counting patrons, and/or conducting visual inspections of shoppers or patrons in the retail establishments. Such techniques are often developed by a market research entity based on products and/or services offered in the retail establishment.
- the names of products and/or services available in a retail establishment can be obtained from store inventory lists developed by retail employees. However, such inventory lists may not include locations of items in the retail establishment to be able to associate a consumer's activity in a particular location with particular products at that location.
- FIG. 1 illustrates a plan view of an example retail establishment having a plurality of product category zones.
- FIG. 2 illustrates an isometric view of an example surveying cart that may be used to implement the example methods and apparatus described herein to survey the retail establishment of FIG. 1 .
- FIG. 3 depicts a rear view of the example surveying cart of FIG. 2 .
- FIG. 4 illustrates an example walk-through path in the example retail establishment of FIG. 1 that may be used to perform a survey of the retail establishment.
- FIG. 5 depicts products placed on a shelving system of the example retail establishment of FIGS. 1 and 4 .
- FIGS. 6A and 6B depict example photographic images of the shelving system and products of FIG. 5 captured in succession using the example surveying cart of FIG. 2 .
- FIGS. 7A and 7B depict the example photographic images of FIGS. 6A and 6B having discard areas indicative of portions of the photographic images to be discarded prior to a stitching process.
- FIGS. 8A and 8B depict cropped photographic image portions of the example photographic images of FIGS. 6A, 6B, 7A, and 7B usable for an image stitching process.
- FIG. 9 depicts an example stitched photographic image composition formed using the example cropped photographic image portions of FIGS. 8A and 8B .
- FIG. 10 is an example navigation assistant graphical user interface (GUI) that may be used to display cart speed status of the example cart of FIG. 2 to assist a person in pushing the cart around the retail environment of FIG. 1 .
- FIG. 11 is an example graphical user interface that may be used to display photographic images and receive user input associated with categorizing the photographic images.
- FIG. 12 is a block diagram of an example apparatus that may be used to implement the example methods described herein to perform product surveys of retail establishments.
- FIG. 13 is a flow diagram of an example method that may be used to collect and process photographic images of retail establishment environments.
- FIG. 14 is a flow diagram of an example method that may be used to merge images of products displayed in a retail establishment to generate merged, stitched, and/or panoramic images of the displayed products.
- FIG. 15 is a flow diagram depicting an example method that may be used to process user input information related to the photographic images collected and processed in connection with the example method of FIGS. 13 and 14 .
- FIG. 16 is a block diagram of an example processor system that may be used to implement some or all of the example methods and apparatus described herein.
- FIG. 17 is a partial view of a cart having a light source and an optical sensor to implement an optical-based dead reckoning system to determine location information indicative of locations traversed by the cart in a retail establishment.
- FIG. 18 is an example panoramic image formed using numerous captured images of products displayed in a retail establishment.
- the example methods and apparatus described herein may be used to survey products in a retail establishment.
- the example methods and apparatus may be used to determine the types of products and their locations in a retail establishment to generate a product layout or map of the retail establishment.
- the product layout can then be used in connection with, for example, consumer behavior monitoring systems and/or consumer surveys to enable product manufacturers to better understand shoppers and how to reach and influence shoppers that buy goods in retail establishments.
- the in-store product layout can be used to determine when products were on shelves and, thus, when shoppers could have been exposed to those products and had the opportunity to purchase them.
- the example methods and apparatus described herein can be used to generate product layout maps that can be correlated with purchasing histories to determine how those product layouts affected consumer purchases.
- the information about the types of products in retail establishments can be used to confirm that products are temporally and spatially correctly placed in the retail establishments.
- the example methods and apparatus described herein can be implemented using a mobile cart having wheels and cameras mounted thereto.
- a survey person can push the mobile cart through a retail establishment (e.g., through product aisles, through checkout lanes, through storefront areas, etc.) as the cameras capture photographs of products placed in the surrounding areas.
- a retail establishment may be partitioned into multiple areas of interest (e.g., category-based areas, product aisles, etc.). Sequentially captured photographic images for each area of interest are then stitched to form a uniform, continuous panoramic photographic image of those areas of interest.
- Identifiers for the stitched photographic images can be stored in a database in association with information about products placed in the areas corresponding to those stitched photographic images.
- the cart used to capture the photographic images may be provided with an application having a user interface to display the in-store photographic images and receive user inputs.
- an application can be provided at a computer system separate from the cart.
- the example methods and apparatus described herein may be implemented using any suitable image type including, for example, photographic images captured using a digital still camera, still-picture or freeze-frame video images captured from a video stream, or any other type of suitable image. For purposes of discussion, the example methods and apparatus are described herein as being implemented using photographic images.
- an example retail establishment 100 includes a plurality of product category zones 102 a - h .
- the retail establishment 100 is a grocery store.
- the example methods and apparatus described herein can be used to survey product layouts in other types of retail establishments (e.g., department stores, clothing stores, specialty stores, hardware stores, etc.).
- the product category zones 102 a - h are assigned sequential numerical values and include a first zone ( 1 ) 102 a , a second zone ( 2 ) 102 b , a third zone ( 3 ) 102 c , a fourth zone ( 4 ) 102 d , a fifth zone ( 5 ) 102 e , a sixth zone ( 6 ) 102 f , a seventh zone ( 7 ) 102 g , and an eighth zone ( 8 ) 102 h .
- a zone is an area of a retail establishment in which a shopper can be expected to have the opportunity to be exposed to products. The boundaries of a zone may relate to product layout throughout the retail establishment and/or natural boundaries that a person could relatively easily perceive.
- zones are created based on the types of products that are sold in particular areas of a retail establishment.
- the first zone ( 1 ) 102 a corresponds to a checkout line category
- the second zone ( 2 ) 102 b corresponds to a canned goods category
- the third zone ( 3 ) 102 c corresponds to a frozen foods category
- the fourth zone ( 4 ) 102 d corresponds to a household goods category
- the fifth zone ( 5 ) 102 e corresponds to a dairy category
- the sixth zone ( 6 ) 102 f corresponds to a meats category
- the seventh zone ( 7 ) 102 g corresponds to a bakery category
- the eighth zone ( 8 ) 102 h corresponds to a produce category.
- a department store may have other types of zones in addition to or instead of the category zones 102 a - h of FIG. 1 that may include, for example, a women's clothing zone, a men's clothing zone, a children's clothing zone, a household appliance zone, an automotive hardware zone, a seasonal items zone, a pharmacy zone, etc.
- surveys of retail establishments may be conducted as described herein without using zones.
- the retailer may provide a map showing store layout characteristics.
- the map can be scanned into a database configured to store scanned maps for a plurality of other monitored retail establishments.
- the retailer can also provide a planogram, which is a diagram, a drawing, or other visual description of a retail establishment's layout, including placement of particular products and product categories. If the retailer cannot provide such information, an audit can be performed of the retailer's establishment by performing a walk through and collecting information indicative of products, product categories, and placements of the same throughout the retail establishment.
- a category zone map (e.g., the plan view of the retail establishment 100 of FIG. 1 ) can be created by importing a scanned map and a planogram or other similar information (e.g., audit information) and adding the category zone information (e.g., the category zones 102 a - h of FIG. 1 ) to the map based on the planogram information (or similar information).
- each of the category zones 102 a - h is created based on a shopper's point of view (e.g., a shopper's exposure to different areas as the shopper moves throughout the retail establishment).
- the store survey information collected using the example methods and apparatus described herein can be used to make correlations between shoppers' locations in the retail establishment and the opportunity those shoppers had to consume or be exposed to in-store products.
- a category zone can be created based on a shopper's line of sight when walking down a particular aisle.
- the category zones can also be created based on natural boundaries throughout a retail establishment such as, for example, changes in floor tile or carpeting, visual obstructions, enclosed areas such as greeting card centers, floral centers, and garden centers.
- FIG. 2 is an isometric view and FIG. 3 is a rear view of an example surveying cart 200 that may be used to perform surveys of retail establishments (e.g., the example retail establishment 100 of FIG. 1 ).
- the example surveying cart 200 includes a base 202 having a front side 204 , a rear side 206 , and two peripheral sides 208 and 210 .
- the surveying cart 200 includes wheels 212 a - b rotatably coupled to the base 202 to facilitate moving the cart 200 throughout a retail establishment (e.g., the retail establishment 100 of FIG. 1 ) during a survey process.
- a caster 214 is coupled to the front side 204 (but in other example implementations may be coupled to the rear side 206 ) of the base 202 .
- the example surveying cart 200 also includes a handle 216 coupled to the rear side 206 to facilitate pushing the cart 200 throughout a retail establishment.
- each of the wheels 212 a - b is independently rotatably coupled to the base 202 via respective arbors 217 a - b as shown in FIG. 3 to enable each of the wheels 212 a - b to rotate independently of the other when, for example, a user pushes the cart 200 in a turning or swerving fashion (e.g., around a corner, not in a straight line, etc.).
- each of the wheels 212 a - b is operatively coupled to a respective rotary encoder 218 a - b .
- the rotary encoders 218 a - b may alternatively be implemented using any other suitable sensors to detect speed and/or travel distance.
- the wheels 212 a - b can be implemented using a soft rubber material creating sufficient friction with floor surface materials (e.g., tile, ceramic, concrete, sealant coatings, etc.) so that the wheels 212 a - b do not slip when the cart 200 is pushed throughout retail establishments.
- the rotary encoders 218 a - b can also be used to implement a wheel-based dead reckoning system to determine the locations traveled by the cart 200 throughout the retail establishment 100 .
- Independently rotatably coupling the wheels 212 a - b to the base 202 enables using the differences between the travel distance measured by the rotary encoder 218 a and the travel distance measured by the rotary encoder 218 b to determine when the cart 200 is turning or is not proceeding in a straight line.
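The wheel-based dead reckoning described above can be sketched as a standard differential-drive pose update. This is a minimal illustrative implementation, not the patent's own; the function name, the segment representation, and the `wheel_base` parameter are assumptions for the sketch.

```python
import math

def dead_reckon(start_pose, segments, wheel_base):
    """Update the cart pose from per-segment wheel travel distances.

    start_pose: (x, y, heading_radians); segments: list of (d_left, d_right)
    travel distances reported by the two rotary encoders over one sampling
    interval; wheel_base: distance between the wheels.  Returns the final
    (x, y, heading) pose.
    """
    x, y, heading = start_pose
    for d_left, d_right in segments:
        d_center = (d_left + d_right) / 2.0        # distance of cart midpoint
        d_theta = (d_right - d_left) / wheel_base  # heading change from wheel difference
        # Advance along the average heading over the segment.
        x += d_center * math.cos(heading + d_theta / 2.0)
        y += d_center * math.sin(heading + d_theta / 2.0)
        heading += d_theta
    return x, y, heading
```

When both encoders report equal distances the heading change is zero and the cart advances in a straight line; a difference between the two distances produces the turn detection the passage describes.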
- two cameras 220 a and 220 b are mounted on the surveying cart 200 in an outwardly facing configuration so that the cameras 220 a - b have a field of view substantially opposing the peripheral sides 208 and 210 of the surveying cart 200 .
- Each of the cameras 220 a - b may be implemented using a digital still camera, a video camera, a web camera, or any other suitable type of camera.
- the cameras 220 a - b may be implemented using high-quality (e.g., high pixel count) digital still cameras to capture high quality photographic images to facilitate accurate optical character recognition and/or image object recognition processing of the captured photographic images.
- the cameras 220 a - b are mounted to the cart 200 so that their fields of view are in substantially perpendicular configurations relative to the direction of travel of the cart 200 .
- a shutter trigger signal of each camera may be controlled based on the movement of the wheels 212 a - b .
- the cart 200 may be configured to trigger the cameras 220 a - b to capture an image each time the wheels 212 a - b rotate a particular number of times based on signals output by one or both of the encoders 218 a - b . In this manner, the image capturing operations of the cameras 220 a - b can be automated based on the travel distance of the cart 200 .
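The distance-based shutter triggering above can be sketched as a tick counter that fires once per fixed travel distance. This is an illustrative sketch only; the class name and its parameters (`ticks_per_rev`, `wheel_circumference`, `trigger_distance`) are assumptions, not values from the patent.

```python
class ShutterTrigger:
    """Fire a camera shutter once per fixed distance of cart travel.

    ticks_per_rev: encoder pulses per wheel revolution;
    wheel_circumference and trigger_distance share one unit (e.g. meters).
    """

    def __init__(self, ticks_per_rev, wheel_circumference, trigger_distance):
        self.meters_per_tick = wheel_circumference / ticks_per_rev
        self.trigger_distance = trigger_distance
        self.distance_since_capture = 0.0
        self.captures = 0

    def on_encoder_tick(self):
        """Called for every encoder pulse; returns True when a capture fires."""
        self.distance_since_capture += self.meters_per_tick
        if self.distance_since_capture >= self.trigger_distance:
            self.distance_since_capture -= self.trigger_distance
            self.captures += 1
            return True  # the caller would assert the camera shutter line here
        return False
```

With a 1 m circumference wheel, a 100-pulse encoder, and a 0.5 m trigger distance, one full wheel revolution yields two captures.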
- the example surveying cart 200 is provided with a display 222 .
- the display 222 is equipped with a touchscreen interface to enable users to interact with applications using a stylus 224 .
- Example graphical user interfaces that may be presented on the display 222 in connection with operations of the example surveying cart 200 are described below in connection with FIGS. 10 and 11 .
- the example cart 200 is provided with range sensors 226 a and 226 b mounted on the peripheral sides 208 and 210 .
- Each of the sensors 226 a and 226 b is mounted in an outwardly facing configuration and is exposed through a respective aperture (one of which is shown in FIG. 2 and designated by numeral 228 ) in one of the peripheral sides 208 and 210 to measure distances to objects adjacent to the cart 200 .
- the cart 200 could be provided with two or more range sensors on each of the peripheral sides 208 and 210 to enable detecting products placed at different heights on product shelves, product racks, or other product furniture.
- when a product is placed below the height of the range sensor 226 a , the sensor 226 a may measure an invalid or incorrect distance or range, but another range sensor mounted lower on the cart 200 as indicated in FIGS. 2 and 3 by a phantom line and reference numeral 227 can measure the distance or range to that product.
- Any number of range sensors substantially similar or identical to the range sensors 226 a - b can be provided on each of the peripheral sides 208 and 210 of the cart 200 .
- FIG. 4 illustrates an example walk-through survey path 400 in the example retail establishment 100 of FIG. 1 that may be used to perform a survey of the retail establishment 100 using the example surveying cart 200 of FIGS. 2 and 3 .
- a person can push the surveying cart 200 through the retail establishment 100 in a path generally indicated by the walk-through survey path 400 while the surveying cart 200 captures successive photographic images of products placed on shelves, stands, racks, refrigerators, freezers, etc.
- as the surveying cart 200 captures photographic images, or after the surveying cart 200 has captured all of the photographic images of the retail establishment 100 , the surveying cart 200 or another post-processing system (e.g., a post-processing system located at a central facility) can stitch or merge corresponding successive photographic images to create continuous panoramic photographic images of product display units (e.g., shelves, stands, racks, refrigerators, freezers, etc.) arranged in respective aisles or zones. Each stitched, merged, or otherwise compiled photographic image can subsequently be used during an analysis phase to determine placements of products within the retail establishment 100 and within each of the category zones 102 a - h ( FIG. 1 ) of the retail establishment 100 .
- the survey path 400 proceeds along peripheral areas of the retail establishment 100 and then through aisles. However, other survey paths that proceed along different routes and zone orderings may be used instead.
- the range sensors 226 a - b measure distances at range measuring points 402 along the path 400 .
- both of the range sensors 226 a - b measure distances on respective sides of the cart 200 .
- in other instances (e.g., when a product display is present on only one side of the cart 200 ), only a corresponding one of the sensors 226 a - b may measure a distance.
- the distance measurements can be used to measure the widths and overall sizes of shopping areas (e.g., aisle widths, aisle length and/or area size, etc.) and/or category zones.
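Deriving an aisle width from the paired range measurements can be sketched as follows; assuming, illustratively, that an aisle's width at a measuring point is the sum of the two sensor ranges plus the distance between the sensor apertures. The function names and the `cart_width` parameter are assumptions for the sketch.

```python
def aisle_width(left_range, right_range, cart_width):
    """Estimate aisle width at one range measuring point.

    left_range/right_range: distances reported by the side-facing range
    sensors to the shelving on each side; cart_width: distance between the
    two sensor apertures.  All values share one unit (e.g. meters).
    """
    return left_range + right_range + cart_width

def mean_aisle_width(samples, cart_width):
    """Average several measuring points along an aisle to smooth sensor noise."""
    return sum(aisle_width(l, r, cart_width) for l, r in samples) / len(samples)
```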
- FIG. 5 depicts an arrangement of products 502 placed on a shelving system 504 of the example retail establishment 100 of FIGS. 1 and 4 .
- the arrangement of products 502 is used to illustrate an example technique that may be used to capture successive photographic images of products throughout the retail establishment 100 and stitch or merge the photographic images to form a compilation of successively captured photographic images as a unitary continuous panoramic photographic image depicting products arranged on a product display unit (e.g., shelves, stands, racks, refrigerators, freezers, etc.) of a corresponding aisle or zone.
- as shown in FIGS. 6A and 6B , when the cart 200 captures photographic images of the arrangement of products 502 , it does so by capturing two successive photographic images, one of which is shown in FIG. 6A and designated as image A 602 and the other of which is shown in FIG. 6B and designated as image B 652 .
- Image A 602 corresponds to a first section 506 ( FIG. 5 ) of the arrangement of products 502
- image B 652 corresponds to a second section 508 ( FIG. 5 ) of the arrangement of products 502 .
- a merging or stitching process is used to join image A 602 and image B 652 along an area that is common to both of the images 602 and 652 .
- FIG. 7A shows peripheral areas 604 and 606 of image A 602 and FIG. 7B shows peripheral areas 654 and 656 of image B 652 that are identified as areas to be discarded.
- These areas 604 , 606 , 654 , and 656 are discarded because of a parallax effect in these areas due to lens radial distortion created by the radius of curvature or rounded characteristics of the camera lenses used in connection with the cameras 220 a - b of FIGS. 2 and 3 .
- the parallax effect makes objects in the peripheral areas 604 , 606 , 654 , and 656 appear shifted relative to objects at the central or middle portions of the photographic images 602 and 652 .
- the parallax effect in the remaining peripheral portions (e.g., the peripheral merge areas 902 and 904 of FIG. 9 ) of the images 602 and 652 used to merge the images 602 and 652 is substantially reduced or eliminated.
- FIG. 9 depicts an example stitched or merged photographic image composition 900 formed using the example cropped photographic images 802 and 852 of FIGS. 8A and 8B .
- merge areas 902 and 904 are identified in the cropped photographic images 802 and 852 as having corresponding, overlapping edges and/or image objects based on the ones of the products 502 appearing in those areas 902 and 904 . Identifying the merge areas 902 and 904 enables creating the stitched or merged photographic image composition 900 by joining (e.g., overlapping, integrating, etc.) the cropped photographic images 802 and 852 at the merge areas 902 and 904 .
- numerous photographic images of a product display unit can be merged to form a panoramic image of that product display unit such as, for example, a panoramic image 1800 of FIG. 18 .
- the panoramic image 1800 is formed by merging the photographic images 802 , 852 , 1802 , and 1804 as shown.
- the photographic images 1802 and 802 are merged at merge area 1806
- the photographic images 802 and 852 are merged at merge area 1808
- the photographic images 852 and 1804 are merged at merge area 1810 .
- any number of photographs may be merged to form a panoramic image of products on display in a retail establishment.
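The merge-area matching described above can be sketched by comparing one-dimensional brightness profiles of the two cropped images: the best overlap is the one where the trailing columns of image A and the leading columns of image B differ least. This is a simplified stand-in for real image stitching (which typically matches 2-D features); the function names, the profile representation, and the `min_overlap` parameter are assumptions for the sketch.

```python
def find_merge_offset(profile_a, profile_b, min_overlap=3):
    """Locate the merge area between two cropped images.

    profile_a and profile_b are 1-D brightness profiles (one mean value per
    pixel column).  Returns the overlap length k minimizing the mean absolute
    difference between A's last k columns and B's first k columns.
    """
    best_k, best_err = min_overlap, float("inf")
    for k in range(min_overlap, min(len(profile_a), len(profile_b)) + 1):
        err = sum(abs(a - b) for a, b in zip(profile_a[-k:], profile_b[:k])) / k
        if err < best_err:
            best_err, best_k = err, k
    return best_k

def stitch(profile_a, profile_b):
    """Join the two profiles at the detected merge area (B's columns win)."""
    k = find_merge_offset(profile_a, profile_b)
    return profile_a[:-k] + profile_b
```

Applying `stitch` repeatedly over a sequence of cropped images yields the kind of panoramic composition depicted in FIG. 18.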
- FIG. 10 is an example navigation assistant graphical user interface (GUI) 1000 that may be used to display cart speed status of the example cart 200 ( FIG. 2 ) to assist a person in pushing the cart 200 in or around a retail environment (e.g., the retail environment 100 of FIG. 1 ).
- the navigation assistant GUI 1000 includes a path of travel display area 1002 to display a path of travel plot 1004 indicative of the locations traversed by the cart 200 during a survey.
- the path of travel plot 1004 is generated based on location information determined using travel distance information generated by the encoders 218 a - b .
- the path of travel plot 1004 can be generated using filtering algorithms, averaging algorithms or other signal processing algorithms to make the path of travel plot 1004 relatively more accurate, smooth, and/or consistent.
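One of the averaging algorithms mentioned above can be sketched as a centered moving average over the sequence of (x, y) location samples. This is an illustrative sketch of one such smoothing pass, not the patent's specific algorithm; the function name and `window` parameter are assumptions.

```python
def smooth_path(points, window=3):
    """Centered moving average over a path of (x, y) samples.

    Averages up to `window` (an odd count of) neighboring samples around
    each point, reducing encoder jitter before the path of travel is drawn.
    """
    half = window // 2
    smoothed = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```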
- the path of travel display area 1002 is also used to display a store layout map 1006 .
- the store layout map 1006 may be indicative of the locations of store furniture (e.g., shelves, counters, stands, etc.) and/or product category zones, and the survey information collected using the example methods and apparatus described herein can be used to determine locations of particular products, advertisements, etc. in the layout map 1006 .
- the store layout map 1006 may not be displayed. For example, a store layout map of a store being surveyed may not yet exist, but the survey information collected as described herein may subsequently be used to generate a store layout map.
- the navigation assistant GUI 1000 is provided with a notification area 1008 to display guidance messages indicating whether a user should decrease the speed of the cart 200 . Also, the navigation assistant GUI 1000 is provided with a speedometer display 1010 . As a user pushes the cart 200 , the user should attempt to keep the speed of the cart 200 lower than a predetermined maximum speed. An acceptable speed may be predetermined or preselected based on one or more criteria including a camera shutter speed, environment lighting conditions, the size of the retail environment 100 to be surveyed within a given duration, etc.
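The speed guidance logic above can be sketched as a simple threshold check against the predetermined maximum. The message strings and the 90% warning band are illustrative assumptions, not text from the patent.

```python
def speed_guidance(speed, max_speed):
    """Return a notification-area message for the current cart speed.

    max_speed is the predetermined maximum chosen from criteria such as
    shutter speed, lighting, and survey duration; both values share one unit.
    """
    if speed > max_speed:
        return "Slow down: cart exceeds survey speed limit"
    if speed > 0.9 * max_speed:
        return "Caution: approaching maximum survey speed"
    return "Speed OK"
```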
- the navigation assistant GUI 1000 is provided with a location display area 1012 .
- the location information displayed in the location display area 1012 can be generated using location generation devices or location receiving devices of the cart 200 .
- the location display area 1012 displays Cartesian coordinates (X, Y), but may alternatively be used to display other types of location information.
- the navigation assistant GUI 1000 is provided with an image captured counter 1014 .
- the navigation assistant GUI 1000 is provided with an initialize button 1016 .
- a user may initialize the cart 200 by positioning the cart 200 to face a direction that is in accordance with the orientation of the store layout map 1006 shown in the path of travel display area 1002 of FIG. 10 .
- the notification area 1008 can be used to display the direction in which the cart 200 should initially be facing before beginning a survey.
- the initial direction information displayed in the notification area 1008 can be displayed as store feature information and can include messages such as, for example, face the rear wall of the store, face the front windows of the store, etc.
- FIG. 11 is an example categorization graphical user interface (GUI) 1100 that may be used to display photographic images and receive user input associated with categorizing the photographic images.
- a person can use the categorization GUI 1100 during or after performing a survey of a retail establishment to retrieve and navigate between the various captured photographic images and tag those images with data pertaining to zones of a store (e.g., the zones 102 a - h of FIG. 1 ).
- the example categorization GUI 1100 and its related operations can be implemented using a processor system (e.g., a computer, a terminal, a server, etc.) at a central facility or some other post processing site to which the survey data collected by the cart 200 is communicated.
- the cart 200 may be configured to implement the example categorization GUI 1100 and its related operations.
- the categorization GUI 1100 is provided with a ‘select store’ menu 1102 via which a person can select the retail establishment for which the person would like to analyze photographic images.
- the categorization GUI 1100 is provided with an image display area 1104 .
- the displayed photographic image is a merged photographic image (e.g., the merged photographic image 900 of FIG. 9 ) while in other example implementations, the displayed photographic image is not a merged photographic image (e.g., one of the photographic images 602 or 652 of FIGS. 6A and 6B ).
- to display location information indicative of a location within a retail environment (e.g., the retail environment 100 of FIG. 1 ), the categorization GUI 1100 is provided with a location display area 1106 .
- the location display area 1106 displays Cartesian coordinates (X, Y), but may alternatively be used to display other types of location information.
- the categorization GUI 1100 also includes a zone tags drop down list 1108 that is populated with a plurality of zones created for the retail establishment associated with the retrieved photographic image. A person can select a zone from the zone tags drop down list 1108 corresponding to the photographic image displayed in the image display area 1104 to associate the selected zone identifier with the displayed photographic image.
- the categorization GUI 1100 is provided with a product codes selection control 1110 .
- a person may select the product codes associated with the products shown in the displayed photographic image to associate the selected product codes with the displayed photographic image and the zone selected in the zone tags drop down list 1108 .
- the person may drag and drop zone tags and/or product codes from the zone tags drop down list 1108 and/or the product codes selection control 1110 to the image display area 1104 to associate those selected zone tags and/or product codes with the displayed photographic image.
- product codes in the product code selection control 1110 can be selected automatically using a character recognition and/or an image recognition process used to recognize products (e.g., types of products, product names, product brands, etc.) in images. That is, after the character and/or image recognition process detects particular product(s) in the image display area 1104 , one or more corresponding product codes can be populated in the zone tags drop down list 1108 based on the product(s) detected using the recognition process.
- the categorization GUI 1100 is provided with an add product code field 1112 .
- the person may add the product code for the new product in the add product code field 1112 .
- the categorization GUI 1100 can be configured to subsequently display the newly added product code in the product codes selection control 1110 .
- FIG. 12 is a block diagram of an example apparatus 1200 that may be used to implement the example methods described herein to perform product surveys of retail establishments (e.g., the retail establishment 100 of FIG. 1 ).
- the example apparatus 1200 may be implemented using any desired combination of hardware, firmware, and/or software. For example, one or more integrated circuits, discrete semiconductor components, and/or passive electronic components may be used. Additionally or alternatively, some or all of the blocks of the example apparatus 1200, or parts thereof, may be implemented using instructions, code, and/or other software and/or firmware, etc. stored on a machine accessible medium that are executed by, for example, a processor system (e.g., the example processor system 1610 of FIG. 16).
- the example apparatus 1200 is provided with a speed detector interface 1202 .
- the speed detector interface 1202 may receive rotary encoding information from the rotary encoders 218 a - b and generate first speed information indicative of the speed of the first wheel 212 a and second speed information indicative of the speed of the second wheel 212 b based on that received information.
- the speed detector interface 1202 can receive the speed information from the encoders 218 a - b for each of the wheels 212 a - b .
- the speed detector interface 1202 can use averaging operations to process the speed information for each wheel 212 a - b for display to a user via, for example, the navigation assistant GUI 1000 of FIG. 10 .
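The conversion from encoder output to a displayable speed can be sketched as follows; the encoder resolution and wheel circumference below are assumed values, not figures from the disclosure.

```python
# Sketch of deriving per-wheel speed from rotary encoder counts.
COUNTS_PER_REV = 512          # assumed encoder resolution (counts/revolution)
WHEEL_CIRCUMFERENCE_M = 0.60  # assumed wheel circumference in meters

def wheel_speed(count_delta, dt_seconds):
    """Convert an encoder count change over dt into linear speed (m/s)."""
    revolutions = count_delta / COUNTS_PER_REV
    return revolutions * WHEEL_CIRCUMFERENCE_M / dt_seconds

def average_speed(left_delta, right_delta, dt_seconds):
    """Average the two wheel speeds, as the speed detector interface might do
    before presenting a single speedometer value to the user."""
    return (wheel_speed(left_delta, dt_seconds) +
            wheel_speed(right_delta, dt_seconds)) / 2.0
```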
- the example apparatus 1200 is provided with a speed monitor 1204 .
- the speed monitor 1204 is configured to monitor the speed information generated by the speed detector interface 1202 to determine whether the cart 200 is moving too fast during a product survey.
- a speed indicator value generated by the speed monitor 1204 can be used to present corresponding messages in the notification area 1008 of FIG. 10 to notify a person pushing the cart 200 whether to decrease the speed of the cart 200 or to keep moving at the same pace.
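A minimal sketch of the feedback logic driving the notification area, assuming a hypothetical speed threshold and message wording:

```python
# The threshold value and message strings are assumptions for illustration.
MAX_SPEED_M_S = 0.8  # assumed maximum acceptable survey speed

def speed_feedback(speed_m_s, max_speed=MAX_SPEED_M_S):
    """Return a (color, message) pair suitable for a notification area."""
    if speed_m_s > max_speed:
        return ("red", "Too fast - please slow down")
    return ("green", "Speed OK - keep moving at this pace")
```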
- the example apparatus 1200 is provided with a range detector interface 1206 .
- the range detector interface 1206 is configured to receive distance information from the range sensors 226 a - b at, for example, each of the range measuring points 402 depicted in FIG. 4 .
- the distance information may be used to determine the distances between each of the cameras 220 a - b and respective target products photographed by the cameras 220 a - b.
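Assuming range readings to the racks on both sides of an aisle, the cart's position between the racks follows from simple geometry; the function names and the cart-width parameter here are illustrative, not from the disclosure.

```python
def lateral_position(left_range_m, right_range_m):
    """Estimate the cart's lateral offset from the aisle centerline given
    range readings to the racks on each side (positive = right of center)."""
    return (left_range_m - right_range_m) / 2.0

def aisle_width(left_range_m, right_range_m, cart_width_m):
    """Rough aisle width: both range readings plus the cart's own width."""
    return left_range_m + right_range_m + cart_width_m
```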
- To receive photographic images from the cameras 220 a-b, the example apparatus 1200 is provided with an image capture interface 1208. To store data (e.g., photographic images, zone tags, product codes, location information, speed information, notification messages, etc.) in a memory 1228 and/or retrieve data from the memory 1228, the example apparatus 1200 is provided with a data interface 1210. In the illustrated example, the data interface 1210 is also configured to transfer survey data from the cart 200 to a post-processing system (e.g., the post processing system 1221 described below).
- the example apparatus 1200 is provided with a location information generator 1212 .
- the location information generator 1212 can be implemented using, for example, a dead reckoning system implemented using the speed detector interface 1202 and one or more motion detectors (e.g., an accelerometer, a gyroscope, etc.).
- the location information generator 1212 is configured to receive speed information from the speed detector interface 1202 for each of the wheels 212 a - b of the cart 200 . In this manner, the location information generator 1212 can monitor when and how far the cart 200 has moved to determine travel distances of the cart 200 .
- the location information generator 1212 can analyze the respective speed information of each of the wheels 212 a and 212 b to detect differences between the rotational speeds of the wheels 212 a - b to determine when the cart 200 is turning or swerving. For example, if the rotational speed of the left wheel is relatively slower than the rotational speed of the right wheel, the location information generator 1212 can determine that the cart 200 is being turned in a left direction.
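The wheel-speed analysis described above amounts to differential-drive dead reckoning. A minimal sketch, assuming a hypothetical track width between the encoded wheels:

```python
import math

# Sketch of a dead-reckoning pose update from per-wheel speeds, as the
# location information generator might compute it. The track width is an
# assumed value, not a figure from the disclosure.
TRACK_WIDTH_M = 0.5  # assumed distance between the two encoded wheels

def dead_reckon_step(x, y, heading, v_left, v_right, dt, track=TRACK_WIDTH_M):
    """Advance the cart pose (x, y, heading in radians) one time step using
    differential-drive kinematics: equal wheel speeds move the cart straight,
    and a relatively slower left wheel turns the cart left."""
    v = (v_left + v_right) / 2.0        # forward speed of the cart center
    omega = (v_right - v_left) / track  # turn rate (left turn positive)
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += omega * dt
    return x, y, heading
```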
- the rotary encoders 218 a-b may not be completely accurate (e.g., encoder output data may exhibit some drift) and/or the wheels 212 a-b may occasionally lose traction with a floor and slip, thereby preventing travel information of the cart 200 from being detected by the rotary encoders 218 a-b.
- the location information generator 1212 can use motion information generated by one or more motion detectors (e.g., an accelerometer, a gyroscope, etc.) as reference information to determine whether location information generated based on wheel speeds should be corrected or adjusted.
- although wheel speed information can be used to generate relatively more accurate travel distance and location information than motion detectors alone, wheel slippage and rotary encoder inaccuracies can occur. The motion sensor(s), however, continuously output movement information as long as the cart 200 is moving, and such motion sensor information can be used to make minor adjustments to the travel distance and/or location information derived using the wheel speed information.
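One simple way to let motion-sensor data make minor adjustments to the wheel-derived estimates is a complementary-filter blend. The weighting below is an assumption chosen so that wheel odometry dominates and the motion sensor only nudges the estimate when the two sources disagree:

```python
# Hedged sketch of blending a wheel-odometry distance estimate with an
# inertial (accelerometer/gyroscope-derived) distance estimate. The 0.9/0.1
# weighting is an illustrative assumption.
def fuse_distance(odometry_m, inertial_m, odometry_weight=0.9):
    """Complementary-filter style blend of two travel-distance estimates."""
    return odometry_weight * odometry_m + (1.0 - odometry_weight) * inertial_m
```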
- the location information generator 1212 can be implemented using an optical-based dead reckoning system that detects travel distances and turning or swerving by the cart 200 using a light source and an optical sensor.
- the location information generator 1212 can be communicatively coupled to a light source 1702 and an optical sensor 1704 (e.g., a black and white complementary metal-oxide semiconductor (CMOS) image capture sensor) mounted to the bottom of a cart 1700.
- the cart 1700 is substantially similar or identical to the cart 200 except for the addition of the light source 1702 and the optical sensor 1704 .
- the rotary encoders 218 a-b can be omitted from the cart 1700 because the light source 1702 and the optical sensor 1704 would provide travel distance and turning or swerving information.
- the light source 1702 is used to illuminate an area 1706 of floor or surface on which the cart 1700 travels and the optical sensor 1704 captures successive images of an optical capture area 1708 on the surface that are used to determine the speed and directions of travel of the cart 1700 .
- the location information generator 1212 can be configured to perform an optical flow algorithm that compares the images successively captured by the optical sensor 1704 to one another to determine motion, direction, and the speed of travel of the cart 1700 .
- the optical flow algorithm is well known in the art and, thus, is not described in greater detail.
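For illustration, the core of such a comparison can be reduced to finding the pixel shift that best aligns two successive captures. The sketch below operates on 1-D brightness profiles for brevity; a real implementation would process 2-D sensor frames and convert the shift to distance using the sensor's ground-distance-per-pixel scale and the frame interval.

```python
# Minimal pure-Python stand-in for the optical-flow comparison: estimate the
# integer pixel shift between two successive brightness profiles by minimizing
# the normalized sum of squared differences over candidate shifts.
def estimate_shift(prev_row, curr_row, max_shift=4):
    """Return the shift (in pixels) that best aligns curr_row to prev_row."""
    best_shift, best_cost = 0, float("inf")
    n = len(prev_row)
    for s in range(-max_shift, max_shift + 1):
        cost, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                cost += (prev_row[i] - curr_row[j]) ** 2
                count += 1
        cost /= count  # normalize so shorter overlaps are not favored
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```

Multiplying the returned shift by the sensor's ground distance per pixel and dividing by the frame interval would yield the speed of travel.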
- the location information generator 1212 can also receive camera-to-product distance information from the range detector interface 1206 to determine where the cart 200 is positioned in a store aisle between two product racks. This information may be used to display a store layout map in a graphical user interface similar to the store layout of FIG. 4 and display a path of travel on the store layout map to show a user where in the store the user is moving the cart 200.
- the location information generated by the location information generator 1212 can be associated with respective photographic images captured by the cameras 220 a - b . In this manner, the location information for each photographic image can be displayed in, for example, the location information area 1106 of FIG. 11 .
- the location information generator 1212 is described as being implemented using a dead reckoning device, any other location information generation or collection technologies can alternatively be used to implement the location information generator 1212 .
- the example apparatus 1200 is provided with a travel path generator 1214 .
- the path of travel information can be used to generate a path of travel through a retail establishment for display to a user while performing a product survey as, for example, described above in connection with FIG. 10 .
- the example apparatus 1200 is provided with an image features detector 1216 .
- the image features detector 1216 can be used to recognize products (e.g., types of products, product names, product brands, etc.) in images in connection with, for example, the image categorization GUI 1100 for use in associating product codes in the product codes selection control 1110 with photographic images.
- the image features detector 1216 can also be configured to identify the merge areas 902 and 904 of FIG. 9 to merge the cropped images 802 and 852 .
- the example apparatus 1200 is provided with an image cropper 1218 .
- the image cropper 1218 may crop the peripheral areas 604 , 606 , 654 and 656 of the photographic images 602 and 652 to produce the cropped photographic images 802 and 852 .
- the example apparatus 1200 is provided with an image merger 1220 .
- the image merger 1220 can be used to merge the cropped images 802 and 852 at the merge areas 902 and 904 to form the merged or stitched image compilation 900 .
- the image features detector 1216 , the image cropper 1218 , and the image merger 1220 can be omitted from the example apparatus 1200 and can instead be implemented as a post processing system 1221 located at a central facility or at some other post processing site (not shown).
- the apparatus 1200 can upload or communicate the images to the post processing system 1221 , and the post processing system 1221 can process the images to form the stitched or merged panoramic photographic images.
- the example apparatus 1200 is provided with a display interface 1222 .
- the display interface may be used to generate and display the navigation assistant GUI 1000 of FIG. 10 and the image categorization GUI 1100 of FIG. 11 .
- the example display interface 1222 may be used to generate and display a layout map of a surveyed retail establishment and a real-time path of travel of the cart 200 as the cart 200 is moved throughout the surveyed retail establishment.
- To associate zone information (e.g., the zone tags of the zone tags drop down list 1108 of FIG. 11) with corresponding captured photographic images (e.g., photographic images displayed in the image display area 1104 of FIG. 11), the example apparatus 1200 is provided with a zone associator 1224.
- To associate product code information (e.g., the product codes of the product codes selection control 1110 of FIG. 11) with captured photographic images, the example apparatus 1200 is provided with a product code associator 1226.
- To receive user selections of zone tags and product codes, the example apparatus 1200 is provided with a user input interface 1230.
- FIGS. 13 , 14 , and 15 depict flow diagrams of example methods that may be used to collect and process photographic images of retail establishment environments.
- the example methods of FIGS. 13 , 14 , and 15 are described as being implemented using the example apparatus 1200 .
- the example methods of FIGS. 13 , 14 , and 15 may be implemented using machine readable instructions comprising one or more programs for execution by a processor (e.g., the processor 1612 shown in the example processor system 1610 of FIG. 16 ).
- the program(s) may be embodied in software stored on one or more tangible media such as a CD-ROM, a floppy disk, a hard drive, a digital versatile disk (DVD), or a memory associated with a processor system (e.g., the processor system 1610 of FIG. 16) and/or embodied in firmware and/or dedicated hardware in a well-known manner.
- the cart 200 ( FIGS. 2 and 3 ) is initialized (block 1302 ).
- an initial location of the cart 200 in the retail establishment 100 can be set in the location information generator 1212 to its known location (e.g., an initial reference location) to generate subsequent location information using dead reckoning techniques.
- a user may initialize the cart 200 by positioning the cart 200 to face a direction that is in accordance with the orientation of the store layout map 1006 shown in the path of travel display area 1002 of FIG. 10 and/or in accordance with direction information displayed in the notification area 1008 of FIG. 10 .
- the user can select the initialize button 1016 to set a current location of the cart 200 to zero, and the cart 200 can subsequently generate location information relative to the zeroed initial location.
- the speed detector interface 1202 measures a speed of the cart 200 (block 1306 ).
- the speed detector interface 1202 can receive information from the rotary encoders 218 a - b and can generate speed information for each of the wheels 212 a - b (and/or an average speed of both of the wheels 212 a - b ) based on the received rotary encoder information.
- the display interface 1222 then displays the speed information (block 1308 ) via the display 222 ( FIG. 2 ).
- the display interface 1222 can display the speed information via the speedometer display 1010 ( FIG. 10 ).
- the speed monitor 1204 determines whether the speed of the cart 200 is acceptable (block 1310 ). For example, the speed monitor 1204 may compare the speed generated at block 1306 with a speed threshold or a speed limit (e.g., a predetermined maximum speed threshold) to determine whether the cart 200 is moving at an acceptable speed.
- An acceptable speed may be predetermined or preselected based on one or more criteria including a camera shutter speed, environment lighting conditions, the size of the retail environment 100 to be surveyed within a given duration, etc.
- the speed monitor 1204 causes the display interface 1222 to display textual and/or color-coded speed feedback indicators to inform a user to improve the speed of the cart 200 (block 1312 ).
- the speed monitor 1204 may cause the display interface 1222 to display a notification message in the notification area 1008 (FIG. 10) to decrease the speed of the cart 200.
- the image capture interface 1208 receives and stores successively captured photographic images (e.g., the photographic images 602 and 652 of FIGS. 6A and 6B ) from each of the cameras 220 a - b (block 1314 ).
- the image capture interface 1208 may be configured to trigger the cameras 220 a-b to capture photographic images at periodic intervals, which may be based on a distance traveled by the cart 200.
- the image capture interface 1208 may obtain the distance traveled by the cart from the speed detector interface 1202 and/or from the location information generator 1212 .
- the distance traveled by the cart 200 may be provided in linear measurement units (e.g., inches, feet, yards, etc.) or may be provided in encoding units generated by the rotary encoders 218 a - b .
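Distance-based triggering can be sketched as follows; the capture interval is an assumed value chosen so that successive photographs overlap enough for stitching, not a figure from the disclosure:

```python
# Sketch of distance-based capture triggering: fire the cameras each time
# the cart advances another capture interval.
CAPTURE_INTERVAL_M = 0.5  # assumed spacing between successive photographs

def capture_points(total_distance_m, interval=CAPTURE_INTERVAL_M):
    """Return the travel distances at which images should be captured."""
    points, next_point = [], interval
    while next_point <= total_distance_m:
        points.append(next_point)
        next_point += interval
    return points
```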
- the image capture interface 1208 then tags each of the photographic images with a respective photo identifier (block 1316 ).
- the location information generator 1212 collects (or generates) location information corresponding to the location of the cart 200 when each of the photographic images was captured at block 1314 (block 1318 ).
- the data interface 1210 then stores the location information generated at block 1318 in association with each respective photo identifier (block 1320 ) in, for example, the memory 1228 .
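The tag-and-store flow of blocks 1316-1320 might look like the following in-memory sketch; the class and method names are illustrative, not part of the disclosure:

```python
import itertools

# In-memory stand-in for the associations kept in the memory 1228: each
# captured image receives a photo identifier, and the cart's location at
# capture time is stored against that identifier.
class SurveyStore:
    def __init__(self):
        self._next_id = itertools.count(1)
        self.records = {}

    def add_image(self, location):
        """Tag a newly captured image with an identifier and record the
        location where it was captured; return the photo identifier."""
        photo_id = next(self._next_id)
        self.records[photo_id] = {"location": location}
        return photo_id
```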
- the example apparatus 1200 determines whether it should continue to acquire photographic images (block 1322 ). For example, if the product survey is not complete, the example apparatus 1200 may determine that it should continue to acquire photographic images (block 1322 ), in which case control is returned to block 1306 . Otherwise, if the product survey is complete, the example apparatus 1200 may determine that it should no longer continue to acquire photographic images (block 1322 ).
- the data interface 1210 communicates the stored images, location information, and photo identifiers to the post processing system 1221 ( FIG. 12 ) (block 1324 ), and the post processing system 1221 merges the images (block 1326 ) to form panoramic images of product displays.
- An example process that may be used to implement the example image merging process of block 1326 is described below in connection with FIG. 14 .
- the example process of FIG. 13 is then ended.
- the image merging process of block 1326 is described as being performed by the post processing system 1221 separate from the apparatus 1200 that is implemented on the cart 200 , in other example implementations, the image merging process of block 1326 can be performed by the example apparatus 1200 at the cart 200 .
- the post processing system 1221 selects photographs to be merged (block 1402 ).
- the post processing system 1221 can select the photographic images 602 and 652 of FIGS. 6A and 6B.
- the image features detector 1216 locates the edge portions of the photographic images to be merged (block 1404 ).
- the image features detector 1216 can locate the peripheral areas 604 , 606 , 654 , and 656 of the photographic images 602 and 652 based on a predetermined edge portion size to be cropped.
- the image cropper 1218 (FIG. 12) can then discard the edge portions (block 1406) located at block 1404.
- the image cropper 1218 can discard the edge portions 604 , 606 , 654 , and 656 to form the cropped images or photographic images 802 and 852 of FIGS. 8A and 8B .
- the image features detector 1216 then identifies merge areas in the cropped photographic images 802 and 852 (block 1408) generated at block 1406.
- the image features detector 1216 can identify the merge areas 902 and 904 of FIG. 9 based on having corresponding, overlapping edges and/or image objects based on the ones of the products 502 appearing in those areas 902 and 904 .
- the image merger 1220 then overlays the cropped photographic images 802 and 852 at the merge areas 902 and 904 (block 1410 ) and merges the cropped photographic images 802 and 852 (block 1412 ) to create the merged or stitched photographic image composition 900 of FIG. 9 .
- the post processing system 1221 then stores the merged photographic image 900 in a memory (e.g., one of the memories 1624 or 1625 of FIG. 16) (block 1414) and determines whether another photograph is to be merged with the merged photographic image 900 generated at block 1412 (block 1416). For example, numerous photographic images of a product display unit can be merged to form a panoramic image of that product display unit such as, for example, the panoramic image 1800 of FIG. 18. If the post processing system 1221 determines that it should merge another photograph with the merged photographic image 900, the post processing system 1221 retrieves the next photograph to be merged (block 1418) and control returns to the operation of block 1404. Otherwise, the example process of FIG. 14 is ended.
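The crop, overlap-detection, and merge operations of FIG. 14 can be sketched on toy images represented as lists of pixel rows. This is a deliberately simplified stand-in: a production stitcher would match image features across the merge areas rather than require exact column equality.

```python
# Simplified sketch of the crop-and-merge flow, with images as lists of rows.
def crop_edges(image, margin):
    """Discard `margin` columns from the left and right edges (block 1406)."""
    return [row[margin:len(row) - margin] for row in image]

def find_overlap(left, right, max_overlap):
    """Find how many trailing columns of `left` equal the leading columns of
    `right` (blocks 1408/1410), preferring the widest match."""
    for k in range(max_overlap, 0, -1):
        if all(lrow[-k:] == rrow[:k] for lrow, rrow in zip(left, right)):
            return k
    return 0

def merge(left, right, max_overlap=3):
    """Merge two cropped images at their overlapping columns (block 1412)."""
    k = find_overlap(left, right, max_overlap)
    return [lrow + rrow[k:] for lrow, rrow in zip(left, right)]
```

Chaining `merge` over a sequence of cropped images would accumulate a panoramic composition in the manner of the loop at blocks 1416-1418.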
- FIG. 15 is a flow diagram depicting an example method that may be used to process user input information (e.g., zone tags, product codes, etc.) related to the photographic images collected and processed in connection with the example methods of FIGS. 13 and 14 .
- the example method of FIG. 15 is implemented using the example categorization GUI 1100 of FIG. 11 .
- the example method of FIG. 15 can be implemented using a processor system (e.g., a computer, a terminal, a server, etc.) at a central facility or some other post processing site to which the survey data collected by the cart 200 is communicated.
- the cart 200 may be configured to implement the example method of FIG. 15 .
- the display interface 1222 ( FIG. 12 ) displays the image categorization user interface 1100 of FIG. 11 (block 1502 ) and a user-requested photographic image (block 1504 ) in the image display area 1104 ( FIG. 11 ).
- the user input interface 1230 receives a zone tag (block 1506 ) selected by a user via the zone tags drop down list 1108 ( FIG. 11 ).
- the user input interface 1230 receives one or more product codes (block 1508 ) selected by the user via the product codes selection control 1110 ( FIG. 11 ).
- the zone associator 1224 ( FIG. 12 ) stores the zone tag in association with a photographic image identifier of the displayed photographic image (block 1510 ) in, for example, the memory 1228 .
- the product code associator 1226 ( FIG. 12 ) stores the product code(s) in association with the photographic image identifier (block 1512 ) in, for example, the memory 1228 .
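The associator behavior of blocks 1510-1512 can be sketched as updates to a per-photo record; the function and field names are illustrative, not part of the disclosure:

```python
# Sketch of how the zone associator and product code associator might record
# user selections against a photo identifier in an in-memory store.
def associate(records, photo_id, zone_tag=None, product_codes=None):
    """Attach a zone tag and/or product codes to an image record."""
    entry = records.setdefault(photo_id, {})
    if zone_tag is not None:
        entry["zone"] = zone_tag
    if product_codes:
        entry.setdefault("product_codes", []).extend(product_codes)
    return entry
```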
- the example apparatus 1200 determines whether it should display another photographic image (block 1514 ). For example, if the user selects another photographic image for display, control returns to block 1504 . Otherwise, if the user closes the image categorization user interface 1100 , the example method of FIG. 15 ends.
- FIG. 16 is a block diagram of an example processor system that may be used to implement some or all of the example methods and apparatus described herein.
- the processor system 1610 includes a processor 1612 that is coupled to an interconnection bus 1614 .
- the processor 1612 may be any suitable processor, processing unit or microprocessor.
- the system 1610 may be a multi-processor system and, thus, may include one or more additional processors that are identical or similar to the processor 1612 and that are communicatively coupled to the interconnection bus 1614 .
- the processor 1612 of FIG. 16 is coupled to a chipset 1618 , which includes a memory controller 1620 and an input/output (I/O) controller 1622 .
- a chipset typically provides I/O and memory management functions as well as a plurality of general purpose and/or special purpose registers, timers, etc. that are accessible or used by one or more processors coupled to the chipset 1618 .
- the memory controller 1620 performs functions that enable the processor 1612 (or processors if there are multiple processors) to access a system memory 1624 and a mass storage memory 1625 .
- the system memory 1624 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc.
- the mass storage memory 1625 may include any desired type of mass storage device including hard disk drives, optical drives, tape storage devices, etc.
- the I/O controller 1622 performs functions that enable the processor 1612 to communicate with peripheral input/output (I/O) devices 1626 and 1628 and a network interface 1630 via an I/O bus 1632 .
- the I/O devices 1626 and 1628 may be any desired type of I/O device such as, for example, a keyboard, a video display or monitor, a mouse, etc.
- the network interface 1630 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 device, a DSL modem, a cable modem, a cellular modem, etc. that enables the processor system 1610 to communicate with another processor system.
- Although the memory controller 1620 and the I/O controller 1622 are depicted in FIG. 16 as separate functional blocks within the chipset 1618, the functions performed by these blocks may be integrated within a single semiconductor circuit or may be implemented using two or more separate integrated circuits.
Description
- The present disclosure relates generally to consumer monitoring and, more particularly, to methods and apparatus to survey a retail environment.
- Retail establishments and product manufacturers are often interested in the shopping activities, behaviors, and/or habits of people in a retail environment. Consumer activity related to shopping can be used to correlate product sales with particular shopping behavior and/or to improve placements of products, advertisements, and/or other product-related information in a retail environment. Known techniques for monitoring consumer activities in retail establishments include conducting surveys, counting patrons, and/or conducting visual inspections of shoppers or patrons in the retail establishments. Such techniques are often developed by a market research entity based on products and/or services offered in the retail establishment. The names of products and/or services available in a retail establishment can be obtained from store inventory lists developed by retail employees. However, such inventory lists may not include locations of items in the retail establishment to be able to associate a consumer's activity in a particular location with particular products at that location.
- FIG. 1 illustrates a plan view of an example retail establishment having a plurality of product category zones.
- FIG. 2 illustrates an isometric view of an example surveying cart that may be used to implement the example methods and apparatus described herein to survey the retail establishment of FIG. 1.
- FIG. 3 depicts a rear view of the example surveying cart of FIG. 2.
- FIG. 4 illustrates an example walk-through path in the example retail establishment of FIG. 1 that may be used to perform a survey of the retail establishment.
- FIG. 5 depicts products placed on a shelving system of the example retail establishment of FIGS. 1 and 4.
- FIGS. 6A and 6B depict example photographic images of the shelving system and products of FIG. 5 captured in succession using the example surveying cart of FIG. 2.
- FIGS. 7A and 7B depict the example photographic images of FIGS. 6A and 6B having discard areas indicative of portions of the photographic images to be discarded prior to a stitching process.
- FIGS. 8A and 8B depict cropped photographic image portions of the example photographic images of FIGS. 6A, 6B, 7A, and 7B useable for an image stitching process.
- FIG. 9 depicts an example stitched photographic image composition formed using the example cropped photographic image portions of FIGS. 8A and 8B.
- FIG. 10 is an example navigation assistant graphical user interface (GUI) that may be used to display cart speed status of the example cart of FIG. 2 to assist a person in pushing the cart around the retail environment of FIG. 1.
- FIG. 11 is an example graphical user interface that may be used to display photographic images and receive user input associated with categorizing the photographic images.
- FIG. 12 is a block diagram of an example apparatus that may be used to implement the example methods described herein to perform product surveys of retail establishments.
- FIG. 13 is a flow diagram of an example method that may be used to collect and process photographic images of retail establishment environments.
- FIG. 14 is a flow diagram of an example method that may be used to merge images of products displayed in a retail establishment to generate merged, stitched, and/or panoramic images of the displayed products.
- FIG. 15 is a flow diagram depicting an example method that may be used to process user input information related to the photographic images collected and processed in connection with the example methods of FIGS. 13 and 14.
- FIG. 16 is a block diagram of an example processor system that may be used to implement some or all of the example methods and apparatus described herein.
- FIG. 17 is a partial view of a cart having a light source and an optical sensor to implement an optical-based dead reckoning system to determine location information indicative of locations traversed by the cart in a retail establishment.
- FIG. 18 is an example panoramic image formed using numerous captured images of products displayed in a retail establishment.
- Although the following discloses example methods, apparatus, and systems including, among other components, software executed on hardware, it should be noted that such methods, apparatus, and systems are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware and software components could be embodied exclusively in hardware, exclusively in software, or in any combination of hardware and software. Accordingly, while the following describes example methods, apparatus, and systems, the examples provided are not the only way to implement such methods, apparatus, and systems.
- The example methods and apparatus described herein may be used to survey products in a retail establishment. For example, the example methods and apparatus may be used to determine the types of products and their locations in a retail establishment to generate a product layout or map of the retail establishment. The product layout can then be used in connection with, for example, consumer behavior monitoring systems and/or consumer surveys to enable product manufacturers to better understand shoppers and how to reach and influence shoppers that buy goods in retail establishments. For example, the in-store product layout can be used to determine when products were on shelves so that shoppers could have been exposed to those products to have the opportunity to purchase those products. The example methods and apparatus described herein can be used to generate product layout maps that can be correlated with purchasing histories to determine how those product layouts affected consumer purchases. In some example implementations, the information about the types of products in retail establishments can be used to confirm that products are temporally and spatially correctly placed in the retail establishments.
- The example methods and apparatus described herein can be implemented using a mobile cart having wheels and cameras mounted thereto. A survey person can push the mobile cart through a retail establishment (e.g., through product aisles, through checkout lanes, through storefront areas, etc.) as the cameras capture photographs of products placed in the surrounding areas. To capture the photographs, a retail establishment may be partitioned into multiple areas of interest (e.g., category-based areas, product aisles, etc.). Sequentially captured photographic images for each area of interest are then stitched to form a uniform, continuous panoramic photographic image of those areas of interest. Identifiers for the stitched photographic images can be stored in a database in association with information about products placed in the areas corresponding to those stitched photographic images. To enable users to store information in connection with the photographic images, the cart used to capture the photographic images may be provided with an application having a user interface to display the in-store photographic images and receive user inputs. Alternatively, such an application can be provided at a computer system separate from the cart. The example methods and apparatus described herein may be implemented using any suitable image type including, for example, photographic images captured using a digital still camera, still-picture or freeze-frame video images captured from a video stream, or any other type of suitable image. For purposes of discussion, the example methods and apparatus are described herein as being implemented using photographic images.
- Turning to FIG. 1, an example retail establishment 100 includes a plurality of product category zones 102 a-h. In the illustrated example, the retail establishment 100 is a grocery store. However, the example methods and apparatus described herein can be used to survey product layouts in other types of retail establishments (e.g., department stores, clothing stores, specialty stores, hardware stores, etc.). The product category zones 102 a-h are assigned sequential numerical values and include a first zone (1) 102 a, a second zone (2) 102 b, a third zone (3) 102 c, a fourth zone (4) 102 d, a fifth zone (5) 102 e, a sixth zone (6) 102 f, a seventh zone (7) 102 g, and an eighth zone (8) 102 h. A zone is an area of a retail establishment in which a shopper can be expected to have the opportunity to be exposed to products. The boundaries of a zone may relate to the product layout throughout the retail establishment and/or natural boundaries that a person could relatively easily perceive. In some example implementations, zones are created based on the types of products that are sold in particular areas of a retail establishment. In the illustrated example, the first zone (1) 102 a corresponds to a checkout line category, the second zone (2) 102 b corresponds to a canned goods category, the third zone (3) 102 c corresponds to a frozen foods category, the fourth zone (4) 102 d corresponds to a household goods category, the fifth zone (5) 102 e corresponds to a dairy category, the sixth zone (6) 102 f corresponds to a meats category, the seventh zone (7) 102 g corresponds to a bakery category, and the eighth zone (8) 102 h corresponds to a produce category. A department store may have other types of zones in addition to or instead of the category zones 102 a-h of FIG. 1 that may include, for example, a women's clothing zone, a men's clothing zone, a children's clothing zone, a household appliance zone, an automotive hardware zone, a seasonal items zone, a pharmacy zone, etc.
In some example implementations, surveys of retail establishments may be conducted as described herein without using zones. - In preparation for surveying a particular retail establishment, the retailer may provide a map showing store layout characteristics. The map can be scanned into a database configured to store scanned maps for a plurality of other monitored retail establishments. In addition to providing the map, or as an alternative to it, the retailer can also provide a planogram, which is a diagram, a drawing, or other visual description of a retail establishment's layout, including placement of particular products and product categories. If the retailer cannot provide such information, an audit of the retailer's establishment can be performed by walking through the establishment and collecting information indicative of products, product categories, and placements of the same throughout the retail establishment. In any case, a category zone map (e.g., the plan view of the retail establishment 100 of FIG. 1) can be created by importing a scanned map and a planogram or other similar information (e.g., audit information) and adding the category zone information (e.g., the category zones 102 a-h of FIG. 1) to the map based on the planogram information (or similar information). - In the illustrated examples described herein, each of the category zones 102 a-h is created based on a shopper's point of view (e.g., a shopper's exposure to different areas as the shopper moves throughout the retail establishment). In this manner, the store survey information collected using the example methods and apparatus described herein can be used to make correlations between shoppers' locations in the retail establishment and the opportunity those shoppers had to consume or be exposed to in-store products. For example, a category zone can be created based on a shopper's line of sight when walking down a particular aisle. The category zones can also be created based on natural boundaries throughout a retail establishment such as, for example, changes in floor tile or carpeting, visual obstructions, or enclosed areas such as greeting card centers, floral centers, and garden centers.
-
FIG. 2 is an isometric view and FIG. 3 is a rear view of an example surveying cart 200 that may be used to perform surveys of retail establishments (e.g., the example retail establishment 100 of FIG. 1). As shown in FIG. 2, the example surveying cart 200 includes a base 202 having a front side 204, a rear side 206, and two peripheral sides. The cart 200 includes wheels 212 a-b rotatably coupled to the base 202 to facilitate moving the cart 200 throughout a retail establishment (e.g., the retail establishment 100 of FIG. 1) during a survey process. To facilitate maneuvering or turning the cart 200, a caster 214 is coupled to the front side 204 (but in other example implementations may be coupled to the rear side 206) of the base 202. The example surveying cart 200 also includes a handle 216 coupled to the rear side 206 to facilitate pushing the cart 200 throughout a retail establishment. - In the illustrated example, each of the wheels 212 a-b is independently rotatably coupled to the
base 202 via respective arbors 217 a-b as shown in FIG. 3 to enable each of the wheels 212 a-b to rotate independently of the other when, for example, a user pushes the cart 200 in a turning or swerving fashion (e.g., around a corner, not in a straight line, etc.). In addition, to track the speed and traveling distance of the cart 200, each of the wheels 212 a-b is operatively coupled to a respective rotary encoder 218 a-b. The rotary encoders 218 a-b may alternatively be implemented using any other suitable sensors to detect speed and/or travel distance. To ensure relatively accurate speed and distance detection, the wheels 212 a-b can be implemented using a soft rubber material that creates sufficient friction with floor surface materials (e.g., tile, ceramic, concrete, sealant coatings, etc.) so that the wheels 212 a-b do not slip when the cart 200 is pushed throughout retail establishments. - In the illustrated example, the rotary encoders 218 a-b can also be used to implement a wheel-based dead reckoning system to determine the locations traveled by the cart 200 throughout the retail establishment 100. Independently rotatably coupling the wheels 212 a-b to the base 202 enables using the differences between the travel distance measured by the rotary encoder 218 a and the travel distance measured by the rotary encoder 218 b to determine when the cart 200 is turning or is not proceeding in a straight line. - To capture photographic images of products in store aisles, two
cameras 220 a-b are mounted to the cart 200 in an outwardly facing configuration so that the cameras 220 a-b have a field of view substantially opposing the peripheral sides of the cart 200. Each of the cameras 220 a-b may be implemented using a digital still camera, a video camera, a web camera, or any other suitable type of camera. In some example implementations, the cameras 220 a-b may be implemented using high-quality (e.g., high pixel count) digital still cameras to capture high quality photographic images to facilitate accurate optical character recognition and/or image object recognition processing of the captured photographic images. In the illustrated example, the cameras 220 a-b are mounted to the cart 200 so that their fields of view are in substantially perpendicular configurations relative to the direction of travel of the cart 200. To control the image captures of the cameras 220 a-b, a shutter trigger signal of each camera may be controlled based on the movement of the wheels 212 a-b. For example, the cart 200 may be configured to trigger the cameras 220 a-b to capture an image each time the wheels 212 a-b rotate a particular number of times based on signals output by one or both of the encoders 218 a-b. In this manner, the image capturing operations of the cameras 220 a-b can be automated based on the travel distance of the cart 200. - To display captured photographic images, information associated with those photographic images, and any other survey-related information, the
example surveying cart 200 is provided with a display 222. In the illustrated example, the display 222 is equipped with a touchscreen interface to enable users to interact with applications using a stylus 224. Example graphical user interfaces that may be presented on the display 222 in connection with operations of the example surveying cart 200 are described below in connection with FIGS. 10 and 11. - To determine distances between the
cart 200 and products (e.g., product shelves, product racks, etc.), the example cart 200 is provided with range sensors 226 a-b mounted to the peripheral sides of the cart 200 (a mounting location shown in FIG. 2 is designated by numeral 228). In some example implementations, the cart 200 could be provided with two or more range sensors on each of the peripheral sides. For example, if a product is placed lower than the height of the range sensor 226 a, the sensor 226 a may measure an invalid or incorrect distance or range, but another range sensor mounted lower on the cart 200, as indicated in FIGS. 2 and 3 by a phantom line and reference numeral 227, can measure the distance or range to that product. Any number of range sensors substantially similar or identical to the range sensors 226 a-b can be provided on each of the peripheral sides of the cart 200. -
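The wheel-driven shutter triggering described above (capturing an image each time the wheels rotate a particular number of times) can be sketched as follows. The class name and tick interval are illustrative assumptions, not part of the disclosed apparatus; real hardware would pulse the cameras' shutter lines rather than count captures.

```python
class DistanceTrigger:
    """Fire the camera shutter once every `interval_ticks` encoder ticks.

    `interval_ticks` is a hypothetical calibration value chosen so that
    successive images overlap enough to be stitched later.
    """

    def __init__(self, interval_ticks):
        self.interval_ticks = interval_ticks
        self._since_last = 0   # ticks accumulated since the last capture
        self.captures = 0      # total images captured so far

    def on_encoder_ticks(self, ticks):
        """Accumulate encoder ticks; return how many captures were fired."""
        self._since_last += ticks
        fired = 0
        while self._since_last >= self.interval_ticks:
            self._since_last -= self.interval_ticks
            self.captures += 1   # hardware would trigger the shutter here
            fired += 1
        return fired
```

Feeding 250 ticks in chunks of 100, 100, and 50 to a trigger with a 100-tick interval fires twice and leaves 50 ticks pending toward the next capture.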
FIG. 4 illustrates an example walk-through survey path 400 in the example retail establishment 100 of FIG. 1 that may be used to perform a survey of the retail establishment 100 using the example surveying cart 200 of FIGS. 2 and 3. Specifically, a person can push the surveying cart 200 through the retail establishment 100 in a path generally indicated by the walk-through survey path 400 while the surveying cart 200 captures successive photographic images of products placed on shelves, stands, racks, refrigerators, freezers, etc. As the surveying cart 200 captures photographic images or after the surveying cart 200 has captured all of the photographic images of the retail establishment 100, the surveying cart 200 or another post-processing system (e.g., a post-processing system located at a central facility) can stitch or merge corresponding successive photographic images to create continuous panoramic photographic images of product display units (e.g., shelves, stands, racks, refrigerators, freezers, etc.) arranged in respective aisles or zones. Each stitched, merged, or otherwise compiled photographic image can subsequently be used during an analysis phase to determine placements of products within the retail establishment 100 and within each of the category zones 102 a-h (FIG. 1) of the retail establishment 100. In the illustrated example, the survey path 400 proceeds along peripheral areas of the retail establishment 100 and then through aisles. However, other survey paths that proceed along different routes and zone orderings may be used instead. - To determine distances between each of the cameras 220 a-b of the cart 200 (
FIG. 2) and respective target products that are photographed, the range sensors 226 a-b measure distances at range measuring points 402 along the path 400. In some instances in which products are placed on both sides of the cart 200, both of the range sensors 226 a-b measure distances on respective sides of the cart 200. For instances in which target products are located only on one side of the cart 200, only a corresponding one of the sensors 226 a-b may measure a distance. The distance measurements can be used to measure the widths and overall sizes of shopping areas (e.g., aisle widths, aisle length and/or area size, etc.) and/or category zones. -
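As a sketch of how the paired range readings described above can yield an aisle width, the two sensor-to-shelf distances are summed with the lateral separation between the opposing sensors. The separation value is a hypothetical calibration constant, not a dimension disclosed for the cart 200.

```python
SENSOR_SEPARATION_M = 0.6  # assumed lateral distance between the two opposing range sensors, metres

def aisle_width(left_range_m, right_range_m, separation_m=SENSOR_SEPARATION_M):
    """Aisle width = left shelf distance + sensor separation + right shelf distance."""
    return left_range_m + separation_m + right_range_m
```

For example, with shelves measured 0.9 m to the left and 1.0 m to the right, the aisle is taken to be 2.5 m wide under the assumed sensor separation.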
FIG. 5 depicts an arrangement of products 502 placed on a shelving system 504 of the example retail establishment 100 of FIGS. 1 and 4. The arrangement of products 502 is used to illustrate an example technique that may be used to capture successive photographic images of products throughout the retail establishment 100 and stitch or merge the photographic images to form a compilation of successively captured photographic images as a unitary continuous panoramic photographic image depicting products arranged on a product display unit (e.g., shelves, stands, racks, refrigerators, freezers, etc.) of a corresponding aisle or zone. - Turning to
FIGS. 6A and 6B, when the cart 200 captures photographic images of the arrangement of products 502, it does so by capturing two successive photographic images, one of which is shown in FIG. 6A and designated as image A 602 and the other of which is shown in FIG. 6B and designated as image B 652. Image A 602 corresponds to a first section 506 (FIG. 5) of the arrangement of products 502, and image B 652 corresponds to a second section 508 (FIG. 5) of the arrangement of products 502. A merging or stitching process is used to join image A 602 and image B 652 along an area that is common to both of the images 602 and 652. - To begin the merging or stitching process,
FIG. 7A shows peripheral areas of image A 602 and FIG. 7B shows peripheral areas of image B 652 that are identified as areas to be discarded. These areas are affected by a parallax effect associated with the positioning of the cameras 220 a-b described above in connection with FIGS. 2 and 3. The parallax effect makes objects in the peripheral areas of the photographic images 602 and 652 appear distorted. For example, the peripheral area 606 of image A 602 corresponds to the peripheral area 654 of image B 652, but the products 502 in respective ones of the peripheral areas 606 and 654 appear different because they were photographed from different perspectives. The peripheral areas are therefore cropped as shown in FIGS. 8A and 8B to create cropped photographic images, leaving peripheral merge areas (shown in FIG. 9) of the images along which the images can be joined. -
FIG. 9 depicts an example stitched or merged photographic image composition 900 formed using the example cropped photographic images of FIGS. 8A and 8B. In the illustrated example, merge areas of the cropped photographic images are areas depicting the same portion of the arrangement of products 502. The merge areas are used to form the photographic image composition 900 by joining (e.g., overlapping, integrating, etc.) the cropped photographic images along the merge areas. A similar process can be used to form the panoramic image 1800 of FIG. 18. In the illustrated example of FIG. 18, the panoramic image 1800 is formed by merging four successive photographic images: the first and second images are joined along merge area 1806, the second and third images are joined along merge area 1808, and the third and fourth images are joined along merge area 1810. Although four photographs are shown as being merged to form the panoramic image 1800 in FIG. 18, any number of photographs may be merged to form a panoramic image of products on display in a retail establishment. -
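The crop-then-join sequence described above can be illustrated with a toy sketch in which each image is a list of equal-length text rows standing in for pixel rows. The crop widths, overlap, and sample content are illustrative assumptions, not values from the disclosed figures.

```python
def crop_peripheral(rows, left_cols, right_cols):
    """Discard the parallax-distorted peripheral columns of an image."""
    end = len(rows[0]) - right_cols
    return [row[left_cols:end] for row in rows]

def merge_pair(a_rows, b_rows, overlap_cols):
    """Join two cropped images whose adjoining `overlap_cols` columns
    depict the same shelf area, keeping the overlapping area only once."""
    return [ra + rb[overlap_cols:] for ra, rb in zip(a_rows, b_rows)]

# Toy example: image A ends with "DE" and image B begins with "DE" (the
# shared merge area); 'x' and 'y' stand for distorted peripheral columns.
image_a = ["xABCDEy"]
image_b = ["xDEFGHy"]
panorama = merge_pair(crop_peripheral(image_a, 1, 1),
                      crop_peripheral(image_b, 1, 1),
                      overlap_cols=2)
# panorama is ["ABCDEFGH"]: the merge area "DE" appears once in the result.
```

In a real implementation the same idea applies to pixel arrays, and the merge area would be located by feature matching rather than assumed at a fixed offset.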
FIG. 10 is an example navigation assistant graphical user interface (GUI) 1000 that may be used to display cart speed status of the example cart 200 (FIG. 2) to assist a person in pushing the cart 200 in or around a retail environment (e.g., the retail environment 100 of FIG. 1). In the illustrated example, the navigation assistant GUI 1000 includes a path of travel display area 1002 to display a path of travel plot 1004 indicative of the locations traversed by the cart 200 during a survey. The path of travel plot 1004 is generated based on location information determined using travel distance information generated by the encoders 218 a-b. In some example implementations, the path of travel plot 1004 can be generated using filtering algorithms, averaging algorithms, or other signal processing algorithms to make the path of travel plot 1004 relatively more accurate, smooth, and/or consistent. In the illustrated example, the path of travel display area 1002 is also used to display a store layout map 1006. In some example implementations, the store layout map 1006 may be indicative of the locations of store furniture (e.g., shelves, counters, stands, etc.) and/or product category zones, and the survey information collected using the example methods and apparatus described herein can be used to determine locations of particular products, advertisements, etc. in the layout map 1006. In other example implementations, the store layout map 1006 may not be displayed. For example, a store layout map of a store being surveyed may not yet exist, but the survey information collected as described herein may subsequently be used to generate a store layout map. - The navigation assistant GUI 1000 is provided with a notification area 1008 to display guidance messages on whether a user should decrease the speed of the cart 200. Also, the navigation assistant GUI 1000 is provided with a speedometer display 1010. As a user pushes the cart 200, the user should attempt to keep the speed of the cart 200 lower than a predetermined maximum speed. An acceptable speed may be predetermined or preselected based on one or more criteria including a camera shutter speed, environment lighting conditions, the size of the retail environment 100 to be surveyed within a given duration, etc. - To display the location of the
cart 200, the navigation assistant GUI 1000 is provided with a location display area 1012. The location information displayed in the location display area 1012 can be generated using location generation devices or location receiving devices of the cart 200. In the illustrated example, the location display area 1012 displays Cartesian coordinates (X, Y), but may alternatively be used to display other types of location information. To display the number of photographic images that have been captured during a survey, the navigation assistant GUI 1000 is provided with an image captured counter 1014. - To initialize the
cart 200 before beginning a survey of a retail establishment, the navigation assistant GUI 1000 is provided with an initialize button 1016. In the illustrated example, a user may initialize the cart 200 by positioning the cart 200 to face a direction that is in accordance with the orientation of the store layout map 1006 shown in the path of travel display area 1002 of FIG. 10. Alternatively or additionally, the notification area 1008 can be used to display the direction in which the cart 200 should initially be facing before beginning a survey. The initial direction information displayed in the notification area 1008 can be displayed as store feature information and can include messages such as, for example, face the rear wall of the store, face the front windows of the store, etc. When the cart 200 is positioned in accordance with the store layout map 1006 and/or the direction in the notification area 1008, the user can select the initialize button 1016 to set a current location of the cart 200 to zero (e.g., location coordinates X,Y=0,0). In this manner, subsequent location information can be generated by the cart 200 relative to the zeroed initial location. -
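The speed monitoring behind the speedometer display and the notification messages described above can be sketched as follows. The wheel geometry, encoder resolution, and speed limit are illustrative assumptions rather than disclosed values.

```python
WHEEL_CIRCUMFERENCE_M = 0.4   # assumed wheel circumference, metres
TICKS_PER_REVOLUTION = 360    # assumed encoder resolution, ticks per wheel revolution

def cart_speed(ticks, dt_seconds,
               circumference=WHEEL_CIRCUMFERENCE_M,
               ticks_per_rev=TICKS_PER_REVOLUTION):
    """Cart speed in m/s from encoder ticks observed over dt_seconds."""
    return ticks / ticks_per_rev * circumference / dt_seconds

def speed_message(speed_mps, max_speed_mps=1.0):
    """Guidance message of the kind shown in the notification area."""
    return "Decrease speed" if speed_mps > max_speed_mps else "Speed OK"
```

Under these assumptions, 360 ticks in one second is one wheel revolution, i.e. 0.4 m/s, which is within the assumed 1.0 m/s limit.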
FIG. 11 is an example categorization graphical user interface (GUI) 1100 that may be used to display photographic images and receive user input associated with categorizing the photographic images. A person can use the categorization GUI 1100 during or after performing a survey of a retail establishment to retrieve and navigate between the various captured photographic images and tag those images with data pertaining to zones of a store (e.g., the zones 102 a-h of FIG. 1). In some example implementations, the example categorization GUI 1100 and its related operations can be implemented using a processor system (e.g., a computer, a terminal, a server, etc.) at a central facility or some other post processing site to which the survey data collected by the cart 200 is communicated. In other example implementations, the cart 200 may be configured to implement the example categorization GUI 1100 and its related operations. - To retrieve photographic images for a particular store, the categorization GUI 1100 is provided with a ‘select store’ menu 1102 via which a person can select the retail establishment for which the person would like to analyze photographic images. To display photographic images, the categorization GUI 1100 is provided with an image display area 1104. In some example implementations, the displayed photographic image is a merged
photographic image 900 of FIG. 9) while in other example implementations, the displayed photographic image is not a merged photographic image (e.g., one of the photographic images 602 and 652 of FIGS. 6A and 6B). To display location information indicative of a location within a retail environment (e.g., the retail environment 100 of FIG. 1) corresponding to each photographic image displayed in the image display area 1104, the categorization GUI 1100 is provided with a location display area 1106. In the illustrated example, the location display area 1106 displays Cartesian coordinates (X, Y), but may alternatively be used to display other types of location information. To tag each photographic image with a respective zone identifier, the categorization GUI 1100 also includes a zone tags drop down list 1108 that is populated with a plurality of zones created for the retail establishment associated with the retrieved photographic image. A person can select a zone from the zone tags drop down list 1108 corresponding to the photographic image displayed in the image display area 1104 to associate the selected zone identifier with the displayed photographic image. - To associate product codes indicative of the products (e.g., the products 502 of
FIG. 5) shown in the photographic image displayed in the image display area 1104, the categorization GUI 1100 is provided with a product codes selection control 1110. A person may select the product codes associated with the products shown in the displayed photographic image to associate the selected product codes with the displayed photographic image and the zone selected in the zone tags drop down list 1108. In some example implementations, the person may drag and drop zone tags and/or product codes from the zone tags drop down list 1108 and/or the product codes selection control 1110 to the image display area 1104 to associate those selected zone tags and/or product codes with the displayed photographic image. - In some example implementations, product codes in the product codes selection control 1110 can be selected automatically using a character recognition and/or an image recognition process used to recognize products (e.g., types of products, product names, product brands, etc.) in images. That is, after the character and/or image recognition process detects particular product(s) in the image display area 1104, one or more corresponding product codes can be populated in the product codes selection control 1110 based on the product(s) detected using the recognition process.
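The zone-tag and product-code associations made through the categorization GUI can be modeled as a small keyed store. The class and field names below are hypothetical, not part of the disclosed interface.

```python
class SurveyTagStore:
    """Associate each captured image with a zone, a location, and product codes."""

    def __init__(self):
        self._tags = {}   # image_id -> {"zone", "location", "products"}

    def tag_image(self, image_id, zone=None, location=None, product_codes=()):
        """Record or update the tags for one photographic image."""
        entry = self._tags.setdefault(
            image_id, {"zone": None, "location": None, "products": set()})
        if zone is not None:
            entry["zone"] = zone
        if location is not None:
            entry["location"] = location          # e.g., Cartesian (X, Y)
        entry["products"].update(product_codes)   # codes accumulate across edits
        return entry

    def images_in_zone(self, zone):
        """All image identifiers tagged with the given zone."""
        return [i for i, e in self._tags.items() if e["zone"] == zone]
```

Tagging the same image twice merges product codes, mirroring a person adding codes to an already zoned image.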
- To add new product codes, the categorization GUI 1100 is provided with an add product code field 1112. When a person sees a new product for which a product code does not exist in the product codes selection control 1110, the person may add the product code for the new product in the add product code field 1112. The categorization GUI 1100 can be configured to subsequently display the newly added product code in the product codes selection control 1110.
-
FIG. 12 is a block diagram of an example apparatus 1200 that may be used to implement the example methods described herein to perform product surveys of retail establishments (e.g., the retail establishment 100 of FIG. 1). The example apparatus 1200 may be implemented using any desired combination of hardware, firmware, and/or software. For example, one or more integrated circuits, discrete semiconductor components, and/or passive electronic components may be used. Additionally or alternatively, some or all of the blocks of the example apparatus 1200, or parts thereof, may be implemented using instructions, code, and/or other software and/or firmware, etc. stored on a machine accessible medium that are executed by, for example, a processor system (e.g., the example processor system 1610 of FIG. 16). - To receive and/or generate speed information based on information from the rotary encoders 218 a-b for each of the wheels 212 a-b of
FIG. 2, the example apparatus 1200 is provided with a speed detector interface 1202. For example, the speed detector interface 1202 may receive rotary encoding information from the rotary encoders 218 a-b and generate first speed information indicative of the speed of the first wheel 212 a and second speed information indicative of the speed of the second wheel 212 b based on that received information. Alternatively, if the rotary encoders 218 a-b are configured to generate speed information, the speed detector interface 1202 can receive the speed information from the encoders 218 a-b for each of the wheels 212 a-b. In some example implementations, the speed detector interface 1202 can use averaging operations to process the speed information for each wheel 212 a-b for display to a user via, for example, the navigation assistant GUI 1000 of FIG. 10. - To monitor the speed of the
cart 200, the example apparatus 1200 is provided with a speed monitor 1204. In the illustrated example, the speed monitor 1204 is configured to monitor the speed information generated by the speed detector interface 1202 to determine whether the cart 200 is moving too fast during a product survey. A speed indicator value generated by the speed monitor 1204 can be used to present corresponding messages in the notification area 1008 of FIG. 10 to notify a person pushing the cart 200 whether to decrease the speed of the cart 200 or to keep moving at the same pace. - To receive distance information measured by the range sensors 226 a-b of
FIG. 2, the example apparatus 1200 is provided with a range detector interface 1206. In the illustrated example, the range detector interface 1206 is configured to receive distance information from the range sensors 226 a-b at, for example, each of the range measuring points 402 depicted in FIG. 4. The distance information may be used to determine the distances between each of the cameras 220 a-b and respective target products photographed by the cameras 220 a-b. - To receive photographic images from the cameras 220 a-b, the
example apparatus 1200 is provided with an image capture interface 1208. To store data (e.g., photographic images, zone tags, product codes, location information, speed information, notification messages, etc.) in a memory 1228 and/or retrieve data from the memory 1228, the example apparatus 1200 is provided with a data interface 1210. In the illustrated example, the data interface 1210 is also configured to transfer survey data from the cart 200 to a post-processing system (e.g., the post processing system 1221 described below). - To generate location information, the
example apparatus 1200 is provided with a location information generator 1212. The location information generator 1212 can be implemented using, for example, a dead reckoning system implemented using the speed detector interface 1202 and one or more motion detectors (e.g., an accelerometer, a gyroscope, etc.). In the illustrated example, to generate location information using dead reckoning techniques, the location information generator 1212 is configured to receive speed information from the speed detector interface 1202 for each of the wheels 212 a-b of the cart 200. In this manner, the location information generator 1212 can monitor when and how far the cart 200 has moved to determine travel distances of the cart 200. In addition, the location information generator 1212 can analyze the respective speed information of each of the wheels 212 a-b to determine when the cart 200 is turning or swerving. For example, if the rotational speed of the left wheel is relatively slower than the rotational speed of the right wheel, the location information generator 1212 can determine that the cart 200 is being turned in a left direction. In some instances, the rotary encoders 218 a-b may not be completely accurate (e.g., encoder output data may exhibit some drift) and/or the wheels 212 a-b may occasionally lose traction with a floor and slip, thereby preventing travel information of the cart 200 from being detected by the rotary encoders 218 a-b. To compensate for or correct such errors or inaccuracies, the location information generator 1212 can use motion information generated by one or more motion detectors (e.g., an accelerometer, a gyroscope, etc.) as reference information to determine whether location information generated based on wheel speeds should be corrected or adjusted.
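The wheel-based dead reckoning described above can be sketched with standard differential-drive equations; the wheel base, circumference, and encoder resolution below are illustrative assumptions, not disclosed dimensions of the cart 200.

```python
import math

WHEEL_BASE_M = 0.5            # assumed lateral distance between the two wheels, metres
WHEEL_CIRCUMFERENCE_M = 0.4   # assumed wheel circumference, metres
TICKS_PER_REVOLUTION = 360    # assumed encoder resolution

def update_pose(x, y, heading, left_ticks, right_ticks):
    """Advance the cart pose (x, y, heading) from one pair of encoder readings.

    The mean of the two wheel distances gives the forward travel; their
    difference divided by the wheel base gives the change in heading, which
    is how unequal wheel speeds reveal turning or swerving.
    """
    d_left = left_ticks / TICKS_PER_REVOLUTION * WHEEL_CIRCUMFERENCE_M
    d_right = right_ticks / TICKS_PER_REVOLUTION * WHEEL_CIRCUMFERENCE_M
    d_center = (d_left + d_right) / 2.0
    d_heading = (d_right - d_left) / WHEEL_BASE_M
    x += d_center * math.cos(heading + d_heading / 2.0)
    y += d_center * math.sin(heading + d_heading / 2.0)
    return x, y, heading + d_heading
```

Equal tick counts on both wheels move the pose straight ahead; a faster right wheel yields a positive (counterclockwise, i.e. leftward) heading change, matching the left-turn example above.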
That is, while wheel speed information can be used to generate relatively more accurate travel distance and location information than using motion detectors alone, when wheel slippage or rotary encoder inaccuracies occur, the motion sensor(s) continuously output movement information as long as the cart 200 is moving, and such motion sensor information can be used to make minor adjustments to the travel distance and/or location information derived using the wheel speed information. - In alternative example implementations, the
location information generator 1212 can be implemented using an optical-based dead reckoning system that detects travel distances and turning or swerving by the cart 200 using a light source and an optical sensor. For example, referring to FIG. 17, which illustrates a partial view of a cart 1700, the location information generator 1212 can be communicatively coupled to a light source 1702 and an optical sensor 1704 (e.g., a black and white complementary metal-oxide semiconductor (CMOS) image capture sensor) mounted to the bottom of the cart 1700. In the illustrated example, the cart 1700 is substantially similar or identical to the cart 200 except for the addition of the light source 1702 and the optical sensor 1704. In addition, the rotary encoders 218 a-b can be omitted from the cart 1700 because the light source 1702 and the optical sensor 1704 would provide travel distance and turning or swerving information. In the illustrated example, the light source 1702 is used to illuminate an area 1706 of the floor or surface on which the cart 1700 travels, and the optical sensor 1704 captures successive images of an optical capture area 1708 on the surface that are used to determine the speed and directions of travel of the cart 1700. For example, to determine paths of travel (e.g., the paths of travel 400 of FIG. 4 and/or 1004 of FIG. 10) and, thus, location information of the cart 1700, the location information generator 1212 can be configured to perform an optical flow algorithm that compares the images successively captured by the optical sensor 1704 to one another to determine the motion, direction, and speed of travel of the cart 1700. Optical flow algorithms are well known in the art and, thus, are not described in greater detail. - In some example implementations, the
location information generator 1212 can also receive camera-to-product distance information from the range detector interface 1206 to determine where the cart 200 is positioned in a store aisle between two product racks. This information may be used to display a store layout map in a graphical user interface similar to the store layout of FIG. 4 and display a path of travel on the store layout map to show a user where in the store the user is moving the cart 200. The location information generated by the location information generator 1212 can be associated with respective photographic images captured by the cameras 220 a-b. In this manner, the location information for each photographic image can be displayed in, for example, the location display area 1106 of FIG. 11. Although the location information generator 1212 is described as being implemented using a dead reckoning device, any other location information generation or collection technologies can alternatively be used to implement the location information generator 1212. - To generate path of travel information based on, for example, the location information generated by the
location information generator 1212, the example apparatus 1200 is provided with a travel path generator 1214. The path of travel information can be used to generate a path of travel through a retail establishment for display to a user while performing a product survey as, for example, described above in connection with FIG. 10. - To perform character recognition and/or image object recognition (e.g., line detection, blob detection, etc.) on photographic images captured by the cameras 220 a-b, the
example apparatus 1200 is provided with an image features detector 1216. The image features detector 1216 can be used to recognize products (e.g., types of products, product names, product brands, etc.) in images in connection with, for example, the image categorization GUI 1100 for use in associating product codes in the product codes selection control 1110 with photographic images. The image features detector 1216 can also be configured to identify the merge areas 902 and 904 of FIG. 9 used to merge the cropped images 802 and 852. - To crop images for a merging process, the
example apparatus 1200 is provided with an image cropper 1218. For example, referring to FIGS. 8A and 8B, the image cropper 1218 may crop the peripheral areas of the photographic images to generate the cropped photographic images 802 and 852. - To merge or stitch sequentially captured photographic images to form a stitched or merged panoramic photographic image of a product rack, the
example apparatus 1200 is provided with an image merger 1220. For example, referring to FIG. 9, the image merger 1220 can be used to merge the cropped images 802 and 852 at the merge areas 902 and 904 to form the image compilation 900. - In some example implementations, the image features
detector 1216, the image cropper 1218, and the image merger 1220 can be omitted from the example apparatus 1200 and can instead be implemented as a post processing system 1221 located at a central facility or at some other post processing site (not shown). For example, after the example apparatus 1200 captures and stores images, the apparatus 1200 can upload or communicate the images to the post processing system 1221, and the post processing system 1221 can process the images to form the stitched or merged panoramic photographic images. - To display information via the
display 222 of the cart 200 of FIG. 2, the example apparatus 1200 is provided with a display interface 1222. For example, the display interface may be used to generate and display the navigation assistant GUI 1000 of FIG. 10 and the image categorization GUI 1100 of FIG. 11. In addition, the example display interface 1222 may be used to generate and display a layout map of a surveyed retail establishment and a real-time path of travel of the cart 200 as the cart 200 is moved throughout the surveyed retail establishment. - To associate zone information (e.g., the zone tags of the zone tags drop down list 1108 of
FIG. 11) with corresponding captured photographic images (e.g., photographic images displayed in the image display area 1104 of FIG. 11), the example apparatus 1200 is provided with a zone associator 1224. In addition, to associate product code information (e.g., the product codes of the product codes selection control 1110 of FIG. 11) with corresponding captured photographic images (e.g., photographic images displayed in the image display area 1104 of FIG. 11), the example apparatus 1200 is provided with a product code associator 1226. To receive user selections of zone tags and product codes, the example apparatus 1200 is provided with a user input interface 1230. -
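The optical-flow-based dead reckoning and path-of-travel generation attributed above to the location information generator 1212 and travel path generator 1214 can be sketched briefly. The patent leaves the optical flow algorithm unspecified as well known; the following minimal sketch assumes phase correlation between successive floor images and a fixed metres-per-pixel scale (both assumptions, as are all names and parameters here, not details from the patent):

```python
import numpy as np

def frame_shift(prev_frame, curr_frame):
    """Estimate the (dy, dx) pixel translation between two grayscale
    frames via phase correlation, one well-known optical flow approach."""
    f1 = np.fft.fft2(prev_frame)
    f2 = np.fft.fft2(curr_frame)
    cross_power = np.conj(f1) * f2
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around peak indices to signed shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

def dead_reckoned_path(frames, metres_per_pixel=0.001):
    """Accumulate per-frame shifts into successive (x, y) floor positions,
    starting from a zeroed initial reference location."""
    x = y = 0.0
    path = [(x, y)]
    for prev_frame, curr_frame in zip(frames, frames[1:]):
        dy, dx = frame_shift(prev_frame, curr_frame)
        x += dx * metres_per_pixel
        y += dy * metres_per_pixel
        path.append((x, y))
    return path
```

The accumulated positions correspond to the location information stored against each photo identifier; joining them produces a path of travel of the kind displayed on the store layout map.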
FIGS. 13, 14, and 15 depict flow diagrams of example methods that may be used to collect and process photographic images of retail establishment environments. In the illustrated example, the example methods of FIGS. 13, 14, and 15 are described as being implemented using the example apparatus 1200. In some example implementations, the example methods of FIGS. 13, 14, and 15 may be implemented using machine readable instructions comprising one or more programs for execution by a processor (e.g., the processor 1612 shown in the example processor system 1610 of FIG. 16). The program(s) may be embodied in software stored on one or more tangible media such as CD-ROMs, floppy disks, hard drives, digital versatile disks (DVDs), or memories associated with a processor system (e.g., the processor system 1610 of FIG. 16) and/or embodied in firmware and/or dedicated hardware in a well-known manner. Further, although the example methods are described with reference to the flow diagrams illustrated in FIGS. 13, 14, and 15, persons of ordinary skill in the art will readily appreciate that many other methods of implementing the example methods may alternatively be used. For example, the order of execution of blocks or operations may be changed, and/or some of the blocks or operations described may be changed, eliminated, or combined. - Turning in detail to
FIG. 13, initially the cart 200 (FIGS. 2 and 3) is initialized (block 1302). For example, an initial location of the cart 200 in the retail establishment 100 can be set in the location information generator 1212 to its known location (e.g., an initial reference location) to generate subsequent location information using dead reckoning techniques. As discussed above in connection with FIG. 10, a user may initialize the cart 200 by positioning the cart 200 to face a direction that is in accordance with the orientation of the store layout map 1006 shown in the path of travel display area 1002 of FIG. 10 and/or in accordance with direction information displayed in the notification area 1008 of FIG. 10. When the cart 200 is positioned in accordance with the store layout map 1006 or the direction in the notification area 1008, the user can select the initialize button 1016 to set a current location of the cart 200 to zero, and the cart 200 can subsequently generate location information relative to the zeroed initial location. - After a user places the
cart 200 in motion (block 1304), the speed detector interface 1202 (FIG. 12) measures a speed of the cart 200 (block 1306). For example, the speed detector interface 1202 can receive information from the rotary encoders 218a-b and can generate speed information for each of the wheels 212a-b (and/or an average speed of both of the wheels 212a-b) based on the received rotary encoder information. The display interface 1222 then displays the speed information (block 1308) via the display 222 (FIG. 2). For example, the display interface 1222 can display the speed information via the speedometer display 1010 (FIG. 10). - The speed monitor 1204 (
FIG. 12) determines whether the speed of the cart 200 is acceptable (block 1310). For example, the speed monitor 1204 may compare the speed generated at block 1306 with a speed threshold or a speed limit (e.g., a predetermined maximum speed threshold) to determine whether the cart 200 is moving at an acceptable speed. An acceptable speed may be predetermined or preselected based on one or more criteria including a camera shutter speed, environment lighting conditions, the size of the retail environment 100 to be surveyed within a given duration, etc. If the speed is not acceptable (block 1310) (e.g., the speed of the cart 200 is too fast), the speed monitor 1204 causes the display interface 1222 to display textual and/or color-coded speed feedback indicators to inform a user to correct the speed of the cart 200 (block 1312). For example, the speed monitor 1204 may cause the display interface 1222 to display a notification message in the notification area 1008 (FIG. 10) to decrease the speed of the cart 200. - After displaying the textual and/or color-coded speed feedback indicators (block 1312) or if the
speed monitor 1204 determines that the speed of the cart 200 is acceptable (block 1310), the image capture interface 1208 receives and stores successively captured photographic images (e.g., the photographic images of FIGS. 6A and 6B) from each of the cameras 220a-b (block 1314). For example, the image capture interface 1208 may be configured to trigger the cameras 220a-b to capture photographic images at periodic intervals, which may be based on a distance traveled by the cart 200. The image capture interface 1208 may obtain the distance traveled by the cart from the speed detector interface 1202 and/or from the location information generator 1212. The distance traveled by the cart 200 may be provided in linear measurement units (e.g., inches, feet, yards, etc.) or may be provided in encoding units generated by the rotary encoders 218a-b. The image capture interface 1208 then tags each of the photographic images with a respective photo identifier (block 1316). - The location information generator 1212 (
FIG. 12) collects (or generates) location information corresponding to the location of the cart 200 when each of the photographic images was captured at block 1314 (block 1318). The data interface 1210 then stores the location information generated at block 1318 in association with each respective photo identifier (block 1320) in, for example, the memory 1228. The example apparatus 1200 then determines whether it should continue to acquire photographic images (block 1322). For example, if the product survey is not complete, the example apparatus 1200 may determine that it should continue to acquire photographic images (block 1322), in which case control is returned to block 1306. Otherwise, if the product survey is complete, the example apparatus 1200 may determine that it should no longer continue to acquire photographic images (block 1322). - If the
example apparatus 1200 determines that it should no longer continue to acquire photographic images (block 1322), the data interface 1210 communicates the stored images, location information, and photo identifiers to the post processing system 1221 (FIG. 12) (block 1324), and the post processing system 1221 merges the images (block 1326) to form panoramic images of product displays. An example process that may be used to implement the example image merging process of block 1326 is described below in connection with FIG. 14. The example process of FIG. 13 is then ended. Although the image merging process of block 1326 is described as being performed by the post processing system 1221 separate from the apparatus 1200 that is implemented on the cart 200, in other example implementations, the image merging process of block 1326 can be performed by the example apparatus 1200 at the cart 200. - Turning to the flow diagram of
FIG. 14, to merge the images captured using the cart 200, initially, the post processing system 1221 selects photographs to be merged (block 1402). For example, the post processing system 1221 can select the photographic images of FIGS. 6A and 6B. The image features detector 1216 (FIG. 12) locates the edge portions of the photographic images to be merged (block 1404). For example, the image features detector 1216 can locate the peripheral areas of the photographic images. The image cropper 1218 (FIG. 12) can then discard the edge portions (block 1406) identified at block 1404. For example, the image cropper 1218 can discard the edge portions of the photographic images as shown in FIGS. 8A and 8B. - The image features
detector 1216 then identifies merge areas in the cropped photographic images 802 and 852 (block 1408) generated at block 1406. For example, the image features detector 1216 can identify the merge areas 902 and 904 of FIG. 9 based on their having corresponding, overlapping edges and/or image objects based on the ones of the products 502 appearing in those areas. The image merger 1220 then overlays the cropped photographic images 802 and 852 at the merge areas 902 and 904 (block 1410) and merges the cropped photographic images 802 and 852 (block 1412) to create the merged or stitched photographic image composition 900 of FIG. 9. The post processing system 1221 then stores the merged photographic image 900 in a memory (e.g., one of the memories 1624 and 1625 of FIG. 16) (block 1414) and determines whether another photograph is to be merged with the merged photographic image 900 generated at block 1412 (block 1416). For example, numerous photographic images of a product display unit can be merged to form a panoramic image of that product display unit such as, for example, the panoramic image 1800 of FIG. 18. If the post processing system 1221 determines that it should merge another photograph with the merged photographic image 900, the post processing system 1221 retrieves the next photograph to be merged (block 1418) and control returns to the operation of block 1404. Otherwise, the example process of FIG. 14 is ended. -
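The merge-area identification and merging of blocks 1408-1412 can be illustrated with a deliberately simplified sketch: the overlap between two crops of the same product display is found by testing candidate overlap widths for the best pixel agreement, and the shared columns are then blended. This is a stand-in for, not a description of, the image features detector 1216 and image merger 1220; the function names and the column-wise search are assumptions:

```python
import numpy as np

def find_merge_width(left, right, max_overlap):
    """Return the overlap width (in columns) at which the right edge of
    `left` best matches the left edge of `right` (smallest mean squared
    difference) -- a crude form of merge-area identification."""
    best_w, best_err = 1, float("inf")
    for w in range(1, max_overlap + 1):
        err = np.mean((left[:, -w:] - right[:, :w]) ** 2)
        if err < best_err:
            best_w, best_err = w, err
    return best_w

def merge_images(left, right, overlap):
    """Stitch two crops by averaging the overlapping merge area."""
    blended = (left[:, -overlap:] + right[:, :overlap]) / 2.0
    return np.hstack([left[:, :-overlap], blended, right[:, overlap:]])
```

For two crops cut from one scene with a shared band of columns, `find_merge_width` recovers the overlap and `merge_images` reproduces the scene; repeating the merge over a sequence of crops yields a panoramic composition of the kind described above.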
FIG. 15 is a flow diagram depicting an example method that may be used to process user input information (e.g., zone tags, product codes, etc.) related to the photographic images collected and processed in connection with the example methods of FIGS. 13 and 14. In the illustrated example, the example method of FIG. 15 is implemented using the example categorization GUI 1100 of FIG. 11. In some example implementations, the example method of FIG. 15 can be implemented using a processor system (e.g., a computer, a terminal, a server, etc.) at a central facility or some other post processing site to which the survey data collected by the cart 200 is communicated. In other example implementations, the cart 200 may be configured to implement the example method of FIG. 15. - Initially, the display interface 1222 (
FIG. 12) displays the image categorization user interface 1100 of FIG. 11 (block 1502) and a user-requested photographic image (block 1504) in the image display area 1104 (FIG. 11). The user input interface 1230 then receives a zone tag (block 1506) selected by a user via the zone tags drop down list 1108 (FIG. 11). In addition, the user input interface 1230 receives one or more product codes (block 1508) selected by the user via the product codes selection control 1110 (FIG. 11). The zone associator 1224 (FIG. 12) stores the zone tag in association with a photographic image identifier of the displayed photographic image (block 1510) in, for example, the memory 1228. The product code associator 1226 (FIG. 12) stores the product code(s) in association with the photographic image identifier (block 1512) in, for example, the memory 1228. The example apparatus 1200 then determines whether it should display another photographic image (block 1514). For example, if the user selects another photographic image for display, control returns to block 1504. Otherwise, if the user closes the image categorization user interface 1100, the example method of FIG. 15 ends. -
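The association step of blocks 1510-1512 amounts to keying each image record by its photo identifier and storing the selected zone tag and product codes against it so they can later be queried together. A minimal in-memory sketch (the dictionary, helper names, and sample identifiers are illustrative assumptions; the patent stores these associations in the memory 1228):

```python
annotations = {}  # photo identifier -> stored zone tag and product codes

def tag_image(photo_id, zone_tag, product_codes):
    """Associate a zone tag and one or more product codes with a
    captured photographic image, keyed by its photo identifier."""
    annotations[photo_id] = {"zone": zone_tag, "products": list(product_codes)}

def images_in_zone(zone_tag):
    """Return all photo identifiers whose images carry `zone_tag`."""
    return sorted(pid for pid, a in annotations.items() if a["zone"] == zone_tag)
```

Querying by zone then recovers every image captured in, say, one aisle of the surveyed retail establishment, together with the product codes a user selected for it.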
FIG. 16 is a block diagram of an example processor system that may be used to implement some or all of the example methods and apparatus described herein. As shown in FIG. 16, the processor system 1610 includes a processor 1612 that is coupled to an interconnection bus 1614. The processor 1612 may be any suitable processor, processing unit or microprocessor. Although not shown in FIG. 16, the system 1610 may be a multi-processor system and, thus, may include one or more additional processors that are identical or similar to the processor 1612 and that are communicatively coupled to the interconnection bus 1614. - The
processor 1612 of FIG. 16 is coupled to a chipset 1618, which includes a memory controller 1620 and an input/output (I/O) controller 1622. As is well known, a chipset typically provides I/O and memory management functions as well as a plurality of general purpose and/or special purpose registers, timers, etc. that are accessible or used by one or more processors coupled to the chipset 1618. The memory controller 1620 performs functions that enable the processor 1612 (or processors if there are multiple processors) to access a system memory 1624 and a mass storage memory 1625. - The
system memory 1624 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc. The mass storage memory 1625 may include any desired type of mass storage device including hard disk drives, optical drives, tape storage devices, etc. - The I/
O controller 1622 performs functions that enable the processor 1612 to communicate with peripheral input/output (I/O) devices and a network interface 1630 via an I/O bus 1632. The network interface 1630 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 device, a DSL modem, a cable modem, a cellular modem, etc. that enables the processor system 1610 to communicate with another processor system. - While the
memory controller 1620 and the I/O controller 1622 are depicted in FIG. 16 as separate functional blocks within the chipset 1618, the functions performed by these blocks may be integrated within a single semiconductor circuit or may be implemented using two or more separate integrated circuits. - Although the above description refers to the flowcharts as being representative of methods, those methods may be implemented entirely or in part by executing machine readable instructions. Therefore, the flowcharts are representative of methods and machine readable instructions.
- Although certain methods, apparatus, and articles of manufacture have been described herein, the scope of coverage of this patent is not limited thereto. To the contrary, this patent covers all methods, apparatus, and articles of manufacture fairly falling within the scope of the appended claims either literally or under the doctrine of equivalents.
Claims (34)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/019,280 US20090192921A1 (en) | 2008-01-24 | 2008-01-24 | Methods and apparatus to survey a retail environment |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090192921A1 true US20090192921A1 (en) | 2009-07-30 |
Family
ID=40900202
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/019,280 Abandoned US20090192921A1 (en) | 2008-01-24 | 2008-01-24 | Methods and apparatus to survey a retail environment |
Country Status (1)
Country | Link |
---|---|
US (1) | US20090192921A1 (en) |
2008-01-24: US application Ser. No. 12/019,280 filed; published as US20090192921A1; status: Abandoned
Patent Citations (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4373133A (en) * | 1980-01-03 | 1983-02-08 | Nicholas Clyne | Method for producing a bill, apparatus for collecting items, and a self-service shop |
US4973952A (en) * | 1987-09-21 | 1990-11-27 | Information Resources, Inc. | Shopping cart display system |
US5287266A (en) * | 1987-09-21 | 1994-02-15 | Videocart, Inc. | Intelligent shopping cart system having cart position determining capability |
US5294781A (en) * | 1991-06-21 | 1994-03-15 | Ncr Corporation | Moving course data collection system |
US6304855B1 (en) * | 1993-11-30 | 2001-10-16 | Raymond R. Burke | Computer system for allowing a consumer to purchase packaged goods at home |
US5640002A (en) * | 1995-08-15 | 1997-06-17 | Ruppert; Jonathan Paul | Portable RF ID tag and barcode reader |
US5692444A (en) * | 1995-10-11 | 1997-12-02 | Stork Brabant B.V. | Cleaning device for use in cleaning a paste supply system of a rotary screen printing machine |
US5821513A (en) * | 1996-06-26 | 1998-10-13 | Telxon Corporation | Shopping cart mounted portable data collection device with tethered dataform reader |
US6574614B1 (en) * | 1996-07-15 | 2003-06-03 | Brad Kesel | Consumer feedback apparatus |
US6026376A (en) * | 1997-04-15 | 2000-02-15 | Kenney; John A. | Interactive electronic shopping system and method |
US6304284B1 (en) * | 1998-03-31 | 2001-10-16 | Intel Corporation | Method of and apparatus for creating panoramic or surround images using a motion sensor equipped camera |
US20030055707A1 (en) * | 1999-09-22 | 2003-03-20 | Frederick D. Busche | Method and system for integrating spatial analysis and data mining analysis to ascertain favorable positioning of products in a retail environment |
US7080061B2 (en) * | 1999-09-30 | 2006-07-18 | Hill-Rom Services, Inc. | Portable locator system |
US6911908B1 (en) * | 1999-10-08 | 2005-06-28 | Activerf Limited | Security |
US7064783B2 (en) * | 1999-12-31 | 2006-06-20 | Stmicroelectronics, Inc. | Still picture format for subsequent picture stitching for forming a panoramic image |
US20020010655A1 (en) * | 2000-05-25 | 2002-01-24 | Realitybuy, Inc. | Real time, three-dimensional, configurable, interactive product display system and method |
US20020007295A1 (en) * | 2000-06-23 | 2002-01-17 | John Kenny | Rental store management system |
US7206753B2 (en) * | 2001-05-04 | 2007-04-17 | Axxon Robotics, Llc | Methods for facilitating a retail environment |
US6584375B2 (en) * | 2001-05-04 | 2003-06-24 | Intellibot, Llc | System for a retail environment |
US20050035198A1 (en) * | 2003-01-23 | 2005-02-17 | Wilensky Craig A. | Mobile wireless computer system including devices and methods related thereto |
US20040224703A1 (en) * | 2003-05-09 | 2004-11-11 | Takaki Steven M. | Method and system for enhancing venue participation by venue participants |
US6928343B2 (en) * | 2003-07-30 | 2005-08-09 | International Business Machines Corporation | Shopper tracker and portable customer service terminal charger |
US7148803B2 (en) * | 2003-10-24 | 2006-12-12 | Symbol Technologies, Inc. | Radio frequency identification (RFID) based sensor networks |
US7420464B2 (en) * | 2004-03-15 | 2008-09-02 | Arbitron, Inc. | Methods and systems for gathering market research data inside and outside commercial establishments |
US7155336B2 (en) * | 2004-03-24 | 2006-12-26 | A9.Com, Inc. | System and method for automatically collecting images of objects at geographic locations and displaying same in online directories |
US20060010030A1 (en) * | 2004-07-09 | 2006-01-12 | Sorensen Associates Inc | System and method for modeling shopping behavior |
US20080306787A1 (en) * | 2005-04-13 | 2008-12-11 | Craig Hamilton | Method and System for Automatically Measuring Retail Store Display Compliance |
US20070071038A1 (en) * | 2005-09-23 | 2007-03-29 | Via Technologies, Inc. | Serial transceiver and method of transmitting package |
US7681796B2 (en) * | 2006-01-05 | 2010-03-23 | International Business Machines Corporation | Mobile device tracking |
US7561192B2 (en) * | 2006-02-14 | 2009-07-14 | Canon Kabushiki Kaisha | Image capturing apparatus, control method therefor, program, and storage medium |
US20080002916A1 (en) * | 2006-06-29 | 2008-01-03 | Luc Vincent | Using extracted image text |
US20080002893A1 (en) * | 2006-06-29 | 2008-01-03 | Luc Vincent | Recognizing text in images |
US20080002914A1 (en) * | 2006-06-29 | 2008-01-03 | Luc Vincent | Enhancing text in images |
US20080170803A1 (en) * | 2007-01-12 | 2008-07-17 | Babak Forutanpour | Panoramic imaging techniques |
US20080249867A1 (en) * | 2007-04-03 | 2008-10-09 | Robert Lee Angell | Method and apparatus for using biometric data for a customer to improve upsale and cross-sale of items |
US20090063307A1 (en) * | 2007-08-31 | 2009-03-05 | Groenovelt Robert Bernand Robin | Detection Of Stock Out Conditions Based On Image Processing |
US20090128644A1 (en) * | 2007-11-15 | 2009-05-21 | Camp Jr William O | System and method for generating a photograph |
Cited By (138)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7710267B2 (en) * | 2007-02-06 | 2010-05-04 | Fujitsu Limited | Information terminal, method and apparatus for providing store information, and computer product |
US20080186193A1 (en) * | 2007-02-06 | 2008-08-07 | Fujitsu Limited | Information terminal, method and apparatus for providing store information, and computer product |
US9135491B2 (en) | 2007-08-31 | 2015-09-15 | Accenture Global Services Limited | Digital point-of-sale analyzer |
US10078826B2 (en) * | 2007-08-31 | 2018-09-18 | Accenture Global Services Limited | Digital point-of-sale analyzer |
US20100039682A1 (en) * | 2008-08-18 | 2010-02-18 | Waterloo Industries, Inc. | Systems And Arrangements For Object Identification |
US20120050520A1 (en) * | 2010-08-24 | 2012-03-01 | Raytheon Company | Method and Apparatus for Anti-Biofouling of Optics in Liquid Environments |
US8577136B1 (en) * | 2010-12-28 | 2013-11-05 | Target Brands, Inc. | Grid pixelation enhancement for in-stock analytics |
US10083453B2 (en) | 2011-03-17 | 2018-09-25 | Triangle Strategy Group, LLC | Methods, systems, and computer readable media for tracking consumer interactions with products using modular sensor units |
US9727838B2 (en) * | 2011-03-17 | 2017-08-08 | Triangle Strategy Group, LLC | On-shelf tracking system |
US20130117053A2 (en) * | 2011-03-17 | 2013-05-09 | Patrick Campbell | On-shelf tracking system |
US10378956B2 (en) | 2011-03-17 | 2019-08-13 | Triangle Strategy Group, LLC | System and method for reducing false positives caused by ambient lighting on infra-red sensors, and false positives caused by background vibrations on weight sensors |
US9141886B2 (en) | 2011-04-01 | 2015-09-22 | CVDM Solutions | Method for the automated extraction of a planogram from images of shelving |
FR2973540A1 (en) * | 2011-04-01 | 2012-10-05 | CVDM Solutions | Method for the automated extraction of a planogram from images of shelving |
WO2012131181A1 (en) * | 2011-04-01 | 2012-10-04 | CVDM Solutions | Method for the automated extraction of a planogram from images of shelving |
US9595098B2 (en) | 2011-08-24 | 2017-03-14 | The Nielsen Company (Us), Llc | Image overlaying and comparison for inventory display auditing |
US8917902B2 (en) | 2011-08-24 | 2014-12-23 | The Nielsen Company (Us), Llc | Image overlaying and comparison for inventory display auditing |
US9324171B2 (en) | 2011-08-24 | 2016-04-26 | The Nielsen Company (Us), Llc | Image overlaying and comparison for inventory display auditing |
US8445864B2 (en) | 2011-08-26 | 2013-05-21 | Raytheon Company | Method and apparatus for anti-biofouling of a protected surface in liquid environments |
EP2867834A4 (en) * | 2012-06-29 | 2016-02-10 | Intel Corp | Image-augmented inventory management and wayfinding |
US9418352B2 (en) | 2012-06-29 | 2016-08-16 | Intel Corporation | Image-augmented inventory management and wayfinding |
US8923893B2 (en) | 2012-08-07 | 2014-12-30 | Symbol Technologies, Inc. | Real-time planogram generation and maintenance |
US20140052562A1 (en) * | 2012-08-17 | 2014-02-20 | Ebay Inc. | System and method for interactive and social shopping |
CN103902149A (en) * | 2012-12-29 | 2014-07-02 | 联想(北京)有限公司 | Data processing method and device |
US10710125B2 (en) | 2013-01-17 | 2020-07-14 | Raytheon Company | Method and apparatus for removing biofouling from a protected surface in a liquid environment |
US9776219B2 (en) | 2013-01-17 | 2017-10-03 | Raytheon Company | Method and apparatus for removing biofouling from a protected surface in a liquid environment |
US9654761B1 (en) * | 2013-03-15 | 2017-05-16 | Google Inc. | Computer vision algorithm for capturing and refocusing imagery |
US10368662B2 (en) | 2013-05-05 | 2019-08-06 | Trax Technology Solutions Pte Ltd. | System and method of monitoring retail units |
US9621505B1 (en) * | 2013-07-20 | 2017-04-11 | Google Inc. | Providing images with notifications |
US20150088701A1 (en) * | 2013-09-23 | 2015-03-26 | Daniel Norwood Desmarais | System and method for improved planogram generation |
US10024718B2 (en) | 2014-01-02 | 2018-07-17 | Triangle Strategy Group Llc | Methods, systems, and computer readable media for tracking human interactions with objects using modular sensor segments |
US10122915B2 (en) * | 2014-01-09 | 2018-11-06 | Trax Technology Solutions Pte Ltd. | Method and device for panoramic image processing |
US10387996B2 (en) | 2014-02-02 | 2019-08-20 | Trax Technology Solutions Pte Ltd. | System and method for panoramic image processing |
US9740955B2 (en) * | 2014-02-28 | 2017-08-22 | Ricoh Co., Ltd. | Method for product recognition from multiple images |
US10176452B2 (en) | 2014-06-13 | 2019-01-08 | Conduent Business Services Llc | Store shelf imaging system and method |
US10402777B2 (en) | 2014-06-18 | 2019-09-03 | Trax Technology Solutions Pte Ltd. | Method and a system for object recognition |
US10579962B2 (en) * | 2014-09-30 | 2020-03-03 | Nec Corporation | Information processing apparatus, control method, and program |
US20170278056A1 (en) * | 2014-09-30 | 2017-09-28 | Nec Corporation | Information processing apparatus, control method, and program |
US11900316B2 (en) * | 2014-09-30 | 2024-02-13 | Nec Corporation | Information processing apparatus, control method, and program |
US11288627B2 (en) * | 2014-09-30 | 2022-03-29 | Nec Corporation | Information processing apparatus, control method, and program |
US20220172157A1 (en) * | 2014-09-30 | 2022-06-02 | Nec Corporation | Information processing apparatus, control method, and program |
CN104363385A (en) * | 2014-10-29 | 2015-02-18 | Fudan University | Line-oriented hardware implementation method for image fusion |
CN104363384A (en) * | 2014-10-29 | 2015-02-18 | Fudan University | Row-based hardware stitching method for video fusion |
US9710723B2 (en) | 2014-10-31 | 2017-07-18 | The Nielsen Company (Us), Llc | Context-based image recognition for consumer market research |
US9569692B2 (en) | 2014-10-31 | 2017-02-14 | The Nielsen Company (Us), Llc | Context-based image recognition for consumer market research |
US11587195B2 (en) | 2015-05-20 | 2023-02-21 | Digimarc Corporation | Image processing methods and arrangements useful in automated store shelf inspections |
US10552933B1 (en) | 2015-05-20 | 2020-02-04 | Digimarc Corporation | Image processing methods and arrangements useful in automated store shelf inspections |
US10735645B2 (en) | 2015-06-18 | 2020-08-04 | The Nielsen Company (Us), Llc | Methods and apparatus to capture photographs using mobile devices |
US10136052B2 (en) | 2015-06-18 | 2018-11-20 | The Nielsen Company (Us), Llc | Methods and apparatus to capture photographs using mobile devices |
US9906712B2 (en) * | 2015-06-18 | 2018-02-27 | The Nielsen Company (Us), Llc | Methods and apparatus to facilitate the capture of photographs using mobile devices |
US11336819B2 (en) | 2015-06-18 | 2022-05-17 | The Nielsen Company (Us), Llc | Methods and apparatus to capture photographs using mobile devices |
US20160373647A1 (en) * | 2015-06-18 | 2016-12-22 | The Nielsen Company (Us), Llc | Methods and apparatus to capture photographs using mobile devices |
US11423075B2 (en) * | 2015-08-31 | 2022-08-23 | Nielsen Consumer Llc | Product auditing in point-of-sale images |
US11853347B2 (en) | 2015-08-31 | 2023-12-26 | Nielsen Consumer, Llc | Product auditing in point-of-sale images |
US10452707B2 (en) | 2015-08-31 | 2019-10-22 | The Nielsen Company (Us), Llc | Product auditing in point-of-sale images |
US20170068840A1 (en) * | 2015-09-09 | 2017-03-09 | Accenture Global Services Limited | Predicting accuracy of object recognition in a stitched image |
US9767387B2 (en) * | 2015-09-09 | 2017-09-19 | Accenture Global Services Limited | Predicting accuracy of object recognition in a stitched image |
US10796262B2 (en) | 2015-09-30 | 2020-10-06 | The Nielsen Company (Us), Llc | Interactive product auditing with a mobile device |
US11562314B2 (en) | 2015-09-30 | 2023-01-24 | The Nielsen Company (Us), Llc | Interactive product auditing with a mobile device |
US20170220990A1 (en) * | 2015-11-03 | 2017-08-03 | Bullet Scanning, Llc | System and method for inventory identification and quantification |
US9652737B1 (en) * | 2015-11-03 | 2017-05-16 | Bullet Scanning, Llc | System and method for inventory identification and quantification |
US9852397B2 (en) * | 2015-11-03 | 2017-12-26 | Bullet Scanning, Llc | System and method for inventory identification and quantification |
US10592854B2 (en) | 2015-12-18 | 2020-03-17 | Ricoh Co., Ltd. | Planogram matching |
US20170177172A1 (en) * | 2015-12-18 | 2017-06-22 | Ricoh Co., Ltd. | Image Recognition Result Visualization Over Time |
US10514825B2 (en) * | 2015-12-18 | 2019-12-24 | Ricoh Co., Ltd. | Image recognition result visualization over time |
US10339690B2 (en) * | 2015-12-18 | 2019-07-02 | Ricoh Co., Ltd. | Image recognition scoring visualization |
US10445821B2 (en) | 2015-12-18 | 2019-10-15 | Ricoh Co., Ltd. | Planogram and realogram alignment |
US20170177195A1 (en) * | 2015-12-18 | 2017-06-22 | Ricoh Co., Ltd. | Image Recognition Scoring Visualization |
US10417696B2 (en) * | 2015-12-18 | 2019-09-17 | Ricoh Co., Ltd. | Suggestion generation based on planogram matching |
US10223737B2 (en) * | 2015-12-28 | 2019-03-05 | Samsung Electronics Co., Ltd. | Automatic product mapping |
US10352689B2 (en) | 2016-01-28 | 2019-07-16 | Symbol Technologies, Llc | Methods and systems for high precision locationing with depth values |
US10565548B2 (en) * | 2016-03-29 | 2020-02-18 | Bossa Nova Robotics Ip, Inc. | Planogram assisted inventory system and method |
US10565554B2 (en) * | 2016-06-10 | 2020-02-18 | Walmart Apollo, Llc | Methods and systems for monitoring a retail shopping facility |
GB2567756B (en) * | 2016-06-10 | 2022-03-09 | Walmart Apollo Llc | Methods and systems for monitoring a retail shopping facility |
US20170357939A1 (en) * | 2016-06-10 | 2017-12-14 | Wal-Mart Stores, Inc. | Methods and Systems for Monitoring a Retail Shopping Facility |
EP3309727A1 (en) * | 2016-10-17 | 2018-04-18 | Conduent Business Services LLC | Store shelf imaging system and method |
US20180107999A1 (en) * | 2016-10-17 | 2018-04-19 | Conduent Business Services, Llc | Store shelf imaging system and method |
US10289990B2 (en) * | 2016-10-17 | 2019-05-14 | Conduent Business Services, Llc | Store shelf imaging system and method |
US11042161B2 (en) | 2016-11-16 | 2021-06-22 | Symbol Technologies, Llc | Navigation control method and apparatus in a mobile automation system |
US11618155B2 (en) | 2017-03-06 | 2023-04-04 | Miso Robotics, Inc. | Multi-sensor array including an IR camera as part of an automated kitchen assistant system for recognizing and preparing food and related methods |
US11351673B2 (en) * | 2017-03-06 | 2022-06-07 | Miso Robotics, Inc. | Robotic sled-enhanced food preparation system and related methods |
US11367092B2 (en) | 2017-05-01 | 2022-06-21 | Symbol Technologies, Llc | Method and apparatus for extracting and processing price text from an image set |
US10949798B2 (en) | 2017-05-01 | 2021-03-16 | Symbol Technologies, Llc | Multimodal localization and mapping for a mobile automation apparatus |
US10591918B2 (en) | 2017-05-01 | 2020-03-17 | Symbol Technologies, Llc | Fixed segmented lattice planning for a mobile automation apparatus |
US11449059B2 (en) | 2017-05-01 | 2022-09-20 | Symbol Technologies, Llc | Obstacle detection for a mobile automation apparatus |
US10726273B2 (en) | 2017-05-01 | 2020-07-28 | Symbol Technologies, Llc | Method and apparatus for shelf feature and object placement detection from shelf images |
US11093896B2 (en) | 2017-05-01 | 2021-08-17 | Symbol Technologies, Llc | Product status detection system |
US10505057B2 (en) | 2017-05-01 | 2019-12-10 | Symbol Technologies, Llc | Device and method for operating cameras and light sources wherein parasitic reflections from a paired light source are not reflected into the paired camera |
US10663590B2 (en) | 2017-05-01 | 2020-05-26 | Symbol Technologies, Llc | Device and method for merging lidar data |
US11600084B2 (en) | 2017-05-05 | 2023-03-07 | Symbol Technologies, Llc | Method and apparatus for detecting and interpreting price label text |
US10521914B2 (en) | 2017-09-07 | 2019-12-31 | Symbol Technologies, Llc | Multi-sensor object recognition system and method |
US10572763B2 (en) | 2017-09-07 | 2020-02-25 | Symbol Technologies, Llc | Method and apparatus for support surface edge detection |
US10628660B2 (en) * | 2018-01-10 | 2020-04-21 | Trax Technology Solutions Pte Ltd. | Withholding notifications due to temporary misplaced products |
US20190213534A1 (en) * | 2018-01-10 | 2019-07-11 | Trax Technology Solutions Pte Ltd. | Withholding notifications due to temporary misplaced products |
US10823572B2 (en) | 2018-04-05 | 2020-11-03 | Symbol Technologies, Llc | Method, system and apparatus for generating navigational data |
US10832436B2 (en) | 2018-04-05 | 2020-11-10 | Symbol Technologies, Llc | Method, system and apparatus for recovering label positions |
US11327504B2 (en) | 2018-04-05 | 2022-05-10 | Symbol Technologies, Llc | Method, system and apparatus for mobile automation apparatus localization |
US10809078B2 (en) | 2018-04-05 | 2020-10-20 | Symbol Technologies, Llc | Method, system and apparatus for dynamic path generation |
US10740911B2 (en) | 2018-04-05 | 2020-08-11 | Symbol Technologies, Llc | Method, system and apparatus for correcting translucency artifacts in data representing a support structure |
US11833663B2 (en) | 2018-08-10 | 2023-12-05 | Miso Robotics, Inc. | Robotic kitchen assistant for frying including agitator assembly for shaking utensil |
US11192258B2 (en) | 2018-08-10 | 2021-12-07 | Miso Robotics, Inc. | Robotic kitchen assistant for frying including agitator assembly for shaking utensil |
US11167421B2 (en) | 2018-08-10 | 2021-11-09 | Miso Robotics, Inc. | Robotic kitchen assistant including universal utensil gripping assembly |
US11010920B2 (en) | 2018-10-05 | 2021-05-18 | Zebra Technologies Corporation | Method, system and apparatus for object detection in point clouds |
US11506483B2 (en) | 2018-10-05 | 2022-11-22 | Zebra Technologies Corporation | Method, system and apparatus for support structure depth determination |
US11577401B2 (en) | 2018-11-07 | 2023-02-14 | Miso Robotics, Inc. | Modular robotic food preparation system and related methods |
US11090811B2 (en) | 2018-11-13 | 2021-08-17 | Zebra Technologies Corporation | Method and apparatus for labeling of support structures |
US11003188B2 (en) | 2018-11-13 | 2021-05-11 | Zebra Technologies Corporation | Method, system and apparatus for obstacle handling in navigational path generation |
JP7209280B2 (en) | 2018-11-19 | 2023-01-20 | ワム・システム・デザイン株式会社 | Information processing device, information processing method, and program |
JP2020086660A (en) * | 2018-11-19 | 2020-06-04 | ワム・システム・デザイン株式会社 | Information processing apparatus, information processing method, and program |
US11079240B2 (en) | 2018-12-07 | 2021-08-03 | Zebra Technologies Corporation | Method, system and apparatus for adaptive particle filter localization |
US11416000B2 (en) | 2018-12-07 | 2022-08-16 | Zebra Technologies Corporation | Method and apparatus for navigational ray tracing |
US11100303B2 (en) | 2018-12-10 | 2021-08-24 | Zebra Technologies Corporation | Method, system and apparatus for auxiliary label detection and association |
US11015938B2 (en) | 2018-12-12 | 2021-05-25 | Zebra Technologies Corporation | Method, system and apparatus for navigational assistance |
US10731970B2 (en) | 2018-12-13 | 2020-08-04 | Zebra Technologies Corporation | Method, system and apparatus for support structure detection |
US11126861B1 (en) | 2018-12-14 | 2021-09-21 | Digimarc Corporation | Ambient inventorying arrangements |
US11592826B2 (en) | 2018-12-28 | 2023-02-28 | Zebra Technologies Corporation | Method, system and apparatus for dynamic loop closure in mapping trajectories |
US11402846B2 (en) | 2019-06-03 | 2022-08-02 | Zebra Technologies Corporation | Method, system and apparatus for mitigating data capture light leakage |
US11662739B2 (en) | 2019-06-03 | 2023-05-30 | Zebra Technologies Corporation | Method, system and apparatus for adaptive ceiling-based localization |
US11151743B2 (en) | 2019-06-03 | 2021-10-19 | Zebra Technologies Corporation | Method, system and apparatus for end of aisle detection |
US11080566B2 (en) | 2019-06-03 | 2021-08-03 | Zebra Technologies Corporation | Method, system and apparatus for gap detection in support structures with peg regions |
US11341663B2 (en) | 2019-06-03 | 2022-05-24 | Zebra Technologies Corporation | Method, system and apparatus for detecting support structure obstructions |
US11960286B2 (en) | 2019-06-03 | 2024-04-16 | Zebra Technologies Corporation | Method, system and apparatus for dynamic task sequencing |
US11200677B2 (en) | 2019-06-03 | 2021-12-14 | Zebra Technologies Corporation | Method, system and apparatus for shelf edge detection |
US11774842B2 (en) | 2019-08-16 | 2023-10-03 | Bossa Nova Robotics Ip, Inc. | Systems and methods for image capture and shelf content detection |
US11507103B2 (en) | 2019-12-04 | 2022-11-22 | Zebra Technologies Corporation | Method, system and apparatus for localization-based historical obstacle handling |
US11107238B2 (en) * | 2019-12-13 | 2021-08-31 | Zebra Technologies Corporation | Method, system and apparatus for detecting item facings |
US11822333B2 (en) | 2020-03-30 | 2023-11-21 | Zebra Technologies Corporation | Method, system and apparatus for data capture illumination control |
USD945507S1 (en) | 2020-06-30 | 2022-03-08 | Bossa Nova Robotics Ip, Inc. | Mobile robot for object detection |
EP3937064A1 (en) * | 2020-07-09 | 2022-01-12 | Kemas GmbH | Detection device |
US11450024B2 (en) | 2020-07-17 | 2022-09-20 | Zebra Technologies Corporation | Mixed depth object detection |
WO2022051038A1 (en) * | 2020-09-02 | 2022-03-10 | Suzy, Inc. | Gamified market research survey interface |
US11593915B2 (en) | 2020-10-21 | 2023-02-28 | Zebra Technologies Corporation | Parallax-tolerant panoramic image generation |
US11392891B2 (en) | 2020-11-03 | 2022-07-19 | Zebra Technologies Corporation | Item placement detection and optimization in material handling systems |
US11847832B2 (en) | 2020-11-11 | 2023-12-19 | Zebra Technologies Corporation | Object classification for autonomous navigation systems |
US11744403B2 (en) | 2021-05-01 | 2023-09-05 | Miso Robotics, Inc. | Automated bin system for accepting food items in robotic kitchen workspace |
US11954882B2 (en) | 2021-06-17 | 2024-04-09 | Zebra Technologies Corporation | Feature-based georegistration for mobile computing devices |
US20230274225A1 (en) * | 2022-01-31 | 2023-08-31 | Walmart Apollo, Llc | Methods and apparatus for generating planograms |
WO2023168093A3 (en) * | 2022-03-04 | 2023-10-12 | Agile Displays Llc | Shopping store planning and operations using large-scale distributed radio infrastructure |
US11893612B2 (en) | 2022-03-04 | 2024-02-06 | Agile Displays Llc | Shopping store planning and operations using large-scale distributed radio infrastructure |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090192921A1 (en) | Methods and apparatus to survey a retail environment | |
CN109214751B (en) | Intelligent inventory management system based on inventory position change | |
US11393213B2 (en) | Tracking persons in an automated-checkout store | |
US20150095189A1 (en) | System and method for scanning, tracking and collating customer shopping selections | |
US7606728B2 (en) | Shopping environment analysis system and method with normalization | |
US8229781B2 (en) | Systems and apparatus to determine shopper traffic in retail environments | |
CN107782316B (en) | Method, apparatus, and system for determining the track of a target object | |
US20160063517A1 (en) | Product exposure analysis in a shopping environment | |
JP2009048430A (en) | Customer behavior analysis device, customer behavior determination system, and customer buying behavior analysis system | |
KR102096230B1 (en) | Determining source lane of moving item merging into destination lane | |
JP2005309951A (en) | Sales promotion support system | |
JP6205484B2 (en) | Behavior analysis device | |
CN106462881A (en) | System and method for determining demographic information | |
US10818031B2 (en) | Systems and methods of determining a location of a mobile container | |
JP2004118453A (en) | Salesroom integrated monitoring system | |
US10475321B2 (en) | Cart wheel failure detection systems and methods | |
JPH11175597A (en) | Merchandise selection behavior information calculating method and its execution system | |
WO2009094031A1 (en) | Methods and apparatus to survey a retail environment | |
Karunarathne et al. | Understanding a public environment via continuous robot observations | |
US20200202553A1 (en) | Information processing apparatus | |
CN111339929A (en) | Retail system of unmanned supermarket | |
US9230275B1 (en) | Systems and methods for merchandising | |
CN111680657A (en) | Method, device, and equipment for determining the person triggering an article pick-and-place event | |
Lee et al. | Understanding human-place interaction from tracking and identification of many users | |
Kröckel et al. | Visual customer behavior analysis at the point of sale |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: NIELSEN MEDIA RESEARCH, INC., NEW YORK Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HICKS, MICHAEL ALAN;REEL/FRAME:020441/0890 Effective date: 20080123 |
|
AS | Assignment |
Owner name: NIELSEN COMPANY (US), LLC, THE, A DELAWARE LIMITED Free format text: MERGER;ASSIGNOR:NIELSEN MEDIA RESEARCH, LLC (FORMERLY KNOWN AS NIELSEN MEDIA RESEARCH, INC.), A DELAWARE LIMITED LIABILITY COMPANY;REEL/FRAME:023084/0570 Effective date: 20081001 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |