US20070247422A1 - Interaction techniques for flexible displays - Google Patents

Info

Publication number
US20070247422A1
Authority
US
United States
Prior art keywords
flexible
display
flexible display
content
flexible surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/731,447
Inventor
Roel Vertegaal
David Holman
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xuuk Inc
Original Assignee
Xuuk Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xuuk Inc filed Critical Xuuk Inc
Priority to US11/731,447 (published as US20070247422A1)
Publication of US20070247422A1
Priority to US12/459,973 (published as US20100045705A1)
Priority to US13/228,681 (granted as US8466873B2)
Priority to US13/589,732 (published as US20130127748A1)
Priority to US13/919,046 (published as US20140085184A1)
Priority to US14/314,589 (published as US20150309611A1)
Priority to US15/293,419 (published as US20170224140A1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00-G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615-G06F 1/1626
    • G06F 1/1637: Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F 1/1652: Details related to the display arrangement, the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G06F 3/0325: Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected

Definitions

  • the present invention relates to a set of interaction techniques for obtaining input to a computer system based on methods and apparatus for detecting properties of the shape, location and orientation of flexible display surfaces, as determined through manual or gestural interactions of a user with said display surfaces.
  • Such input may be used to alter graphical content and functionality displayed on said surfaces or some other display or computing system.
  • One aspect of the invention is a set of interaction techniques for manipulating graphical content and functionality displayed on flexible displays based on methods for detecting the shape, location and orientation of said displays in 3 dimensions and along 6 degrees of freedom, as determined through manual or gestural interaction by a user with said display.
  • Another aspect of the invention is a capture and projection system, used to simulate or otherwise implement a flexible display. Projecting computer graphics onto physical flexible materials allows for a seamless integration between images and multiple 3D surfaces of any shape or form, one that measures and corrects for 3D skew in real time.
  • Another aspect of the invention is the measurement of the deformation, orientation and/or location of flexible display surfaces, for the purpose of using said shape as input to the computer system associated with said display.
  • a Vicon Motion Capturing System ( 23 ) or equivalent computer vision system is used to measure the location in three dimensional space of retro-reflective markers affixed to or embedded within the surface of the flexible display unit.
  • movement is tracked through wireless accelerometers embedded into the flexible display surface in lieu of said retro-reflective markers, or deformations are tracked through some fiber optics embedded in the display surface.
  • One embodiment of the invention is the application of said interaction techniques to flexible displays that resemble paper.
  • the interaction techniques are applied to any form of polymer or organic light emitting diode-based electronic flexible display technology.
  • Another embodiment of the invention is the application of said interaction techniques to flexible displays that mimic or otherwise behave as materials other than paper, including but not limited to textiles whether or not worn on the human body, three-dimensional objects, liquids and the like.
  • interaction techniques apply to projection on the skin of live or dead human bodies, the shape of which is sensed via computer vision or embedded accelerometer devices.
  • FIG. 1 shows a Hold Gesture with flexible display surface ( 1 ).
  • flexible display surfaces and fingers in FIGS. 1 through 10 may include some (hidden) marker(s) ( 3 ) according to FIG. 11 or FIG. 12 that have not been included in the drawings for reasons of clarity.
  • FIG. 2 shows a Collocate Gesture with flexible display surfaces ( 1 ).
  • FIG. 3 shows a Collate Gesture with flexible display surfaces ( 1 ).
  • FIG. 4 shows a Flip Gesture, Fold and Half-fold Gestures with flexible display surface ( 1 ).
  • FIG. 5 shows a Roll Gesture with flexible display surface ( 1 ) with markers ( 3 ).
  • FIG. 6 shows a Bend Gesture with flexible display surface ( 1 ) and foldline ( 2 ).
  • FIG. 7 shows a Rub Gesture with flexible display surface ( 1 ).
  • FIG. 8 shows a Staple Gesture with flexible display surface ( 1 ).
  • FIG. 9 shows a Pointing Gesture with flexible display surface ( 1 ).
  • FIG. 10 shows a Multi-handed Pointing Gesture with flexible display surface ( 1 ).
  • FIG. 11 shows a Flexible display surface ( 1 ) with markers ( 3 ).
  • FIG. 12 shows another embodiment of flexible display surface ( 1 ) made of fabric or similar materials with markers ( 3 ).
  • FIG. 13 shows a System apparatus for tracking flexible display surface ( 1 ) through computer vision cameras emitting infrared light ( 4 ) mounted above a workspace with user ( 7 ), where markers ( 3 ) affixed to flexible display surface ( 1 ) reflect infrared light emitted by computer vision cameras ( 4 ).
  • digital projection system ( 5 ) projects images of the modeled flexible display surfaces rendered with textures back onto said flexible display surfaces.
  • “Flexible Display” or “Flexible Display Surface” means any display surface made of any material, including, but not limited to displays constituted by projection and including, but not limited to real and electronic paper known in the art, based on Organic Light Emitting Devices or other forms of thin, thin-film or e-ink based technologies such as, e.g., described in U.S. Pat. No.
  • Paper Window refers to one embodiment of a flexible display surface implemented by tracking the shape, orientation and location of a sheet of paper, projecting back an image onto said sheet of paper using a projection system, such that it constitutes a flexible electronic display.
  • the term may be interpreted as interchangeable with flexible display, flexible display surface or document, but the terms flexible display, document and flexible display surface shall not be limited to such interpretation.
  • Marker refers to a device that is affixed to a specific location on a flexible display surface for the purpose of tracking the position or orientation of said location on said surface.
  • Said marker may consist of a small half-sphere made of material that reflects light in the infrared spectrum for the purpose of tracking location with an infrared computer vision camera.
  • Said marker may also consist of an accelerometer that reports to a computer system for the purpose of computing the location of said marker, or any other type of location tracking system known in the art.
  • A similar term used in this context is “point.”
  • “Fold” is synonymous with “Bend,” wherein folding is typically interpreted as limited to a horizontal or vertical axis of the surface, whereas Bends can occur along any axis ( 2 ). Folding does not necessarily lead to a crease.
  • Position and shape of flexible displays can be adjusted for various tasks: these displays can be spread about the desk, organized in stacks, or held close for a detailed view.
  • Direct manipulation takes place with the paper display itself: by selecting and pointing using the fingers, or with a digital pen.
  • the grammar of the interaction styles provided by this invention follows that of natural manipulation of paper and other flexible materials that hold information.
  • FIGS. 1 through 10 show a set of gestures based on deformations and location of the flexible display(s). These gestures provide the basic units of interaction with the system:
  • the currently held display is the active document ( FIG. 1 ).
  • FIG. 2 shows the use of spatial arrangement of the flexible display(s) for organizing or rearranging information on said display(s).
  • collocating multiple flexible displays allows image contents to be automatically spread or enlarged across multiple flexible displays that are collocated.
  • FIG. 3 shows how users may stack flexible displays, organizing said displays in piles on a desk.
  • Such physical organization is reflected in the digital world by semantically associating or otherwise relating computer content of the displays, be it files, web-based or other information, located in a database, on a server, file system or the like, for example, by sorting such computer content according to some property of the physical organization of the displays.
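  • As a non-limiting illustrative sketch (an assumption, not part of the disclosed apparatus), the ordering of a physical pile could be derived from the tracked height of each display's center marker and then used to sort the associated documents:

```python
# Hypothetical sketch: derive a digital ordering from a physical stack of
# flexible displays, assuming each display reports the 3D position of its
# center marker (z = height above the desk).
from dataclasses import dataclass

@dataclass
class TrackedDisplay:
    doc_id: str
    center: tuple  # (x, y, z) in workspace coordinates

def stack_order(displays):
    """Return document ids sorted bottom-of-pile first, top of pile last."""
    return [d.doc_id for d in sorted(displays, key=lambda d: d.center[2])]

if __name__ == "__main__":
    pile = [
        TrackedDisplay("report.pdf", (0.10, 0.20, 0.012)),
        TrackedDisplay("sketch.png", (0.11, 0.21, 0.004)),
        TrackedDisplay("budget.xls", (0.10, 0.19, 0.020)),
    ]
    print(stack_order(pile))  # ['sketch.png', 'report.pdf', 'budget.xls']
```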
  • FIG. 4 shows how users may flip or turn the flexible display by folding it over its x or y axis, thus revealing the other side of the display.
  • Flipping or turning the flexible display around an axis may reveal information that is stored contiguously to the information displayed on the edge of the screen. Note that this flipping or turning gesture is distinct from that of rotating a rigid display surface, in that the folds that occur in the display in the process of turning or flipping the display around its axes are used in detecting said turn or flip.
  • a flip gesture around the x axis may, in a non-limiting example, scroll the associated page content in the direction opposite to that of the gesture.
  • When the flexible display is flipped around the x axis, such that the bottom of the display is lifted up and then folded over to the top, the associated graphical content scrolls down, thus revealing content below what is currently displayed on the display.
  • the opposite gesture, lifting the top of the display, then folding it over to the bottom of the display causes content to scroll up, revealing information above what is currently displayed.
  • flipping gestures around the x-axis may be used by the application to navigate to the prior or next page of said document, depending on the directionality of the gesture.
  • said gesture may be used to navigate to the previous or next page of the browsing history, depending on the directionality of the gesture.
  • the flexible display is flipped around the y axis, such that the right hand side of the display is folded up, then over to the left. This may cause content to scroll to the right, revealing information to the right of what is currently on display.
  • the opposite gesture, folding the left side of the display up then over to the right, may cause content to scroll to the left, revealing information to the left of what is currently on display.
  • flipping gestures around the y-axis may be used by the application to navigate to the prior or next page of said document, depending on the directionality of the gesture.
  • said gesture may be used to navigate to the previous or next page of the browsing history, depending on the directionality of the gesture.
  • Fold Note that wherever the term “Fold” is used it can be substituted for the term “Bend” and vice versa, wherein folding is typically interpreted as limited to a horizontal or vertical axis of the surface. Folding a flexible display around either or both of its horizontal and vertical axes, either in sequence or simultaneously, serves as a means of input to the software that alters the image content of the document, or affects associated computing functionality (see FIG. 4 ). As a non-limiting example, this may cause objects displayed in the document to be moved to the center of gravity of the fold, or sorted according to a property displayed in the center of gravity of the fold. As another non-limiting example, following the gravity path of the fold that would exist if water were run through that fold, objects may be moved from one flexible display to a second flexible display placed underneath it.
  • Semi-permanent fold Where the act of folding a flexible display around either its horizontal or vertical axis, or both, in such way that it remains in a semi-permanent folded state after release, serves as input to a computing system.
  • folding causes any contents associated with flexible displays to be digitally archived.
  • the unfolding of the flexible display causes any contents associated with said flexible display to be un-archived and displayed on said flexible display.
  • said flexible display would reduce its power consumption upon a semi-permanent fold, increasing power consumption upon unfold ( FIG. 4 ).
  • Roll Where the act of changing the shape of a flexible display such that said shape transitions from planar to cylindrical or vice versa serves as input to a computing system.
  • this causes any contents associated with the flexible display to be digitally archived upon a transition from planar to cylindrical shape (rolling up), and to be un-archived and displayed onto said flexible display upon a transition from cylindrical to planar shape (unrolling).
  • rolling up a display causes it to turn off, while unrolling a display causes it to turn on, or display content ( FIG. 5 ).
  • Bend Where bending a flexible display around any axis serves as input to a computing system. Bend may produce some visible or invisible fold line ( 2 ) that may be used to select information on said display, for example, to determine a column of data properties in a spreadsheet that should be used for sorting.
  • a bending action causes graphical information to be transformed such that it follows the curvature of the flexible display, either in two or three dimensions. The release of a bending action causes the graphical content associated with the flexible display to return to its original shape. Alternatively, deformations obtained through bending may become permanent upon release of the bending action. (See FIG. 6 ).
  • the rubbing gesture allows users to transfer content between two or more flexible displays, or between a flexible display and a computing peripheral (see FIG. 7 ).
  • the rubbing gesture is detected by measuring back and forth motion of the hand on the display, typically horizontally. This gesture is typically interpreted such that information from the top display is transferred, that is either copied or moved, to the display(s) or peripheral(s) directly beneath it.
  • If the top display is not associated with any content (i.e., is empty), it becomes the destination and the object directly beneath the display becomes the source of the information transfer.
  • If the object beneath the flexible display is a printer, the rubbing gesture causes the content of said display to be printed on said printer.
  • If the object beneath is a computer screen, the active window on that screen is transferred to the flexible display such that it displays on said display.
  • If the flexible display already contains content, said content is transferred back to the computer screen instead.
  • In sum, the rubbing gesture applied to the top display causes information to be copied from the top to the bottom display if the top display holds content, and from the bottom to the top display if the top display is empty.
  • information transfer may be limited to those graphical objects that are currently selected on the source display.
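  • As a non-limiting sketch of the transfer rule above, assuming hypothetical content dictionaries for the two stacked objects, the direction of transfer can be decided by whether the top display holds content:

```python
# Hypothetical sketch of the rub-to-transfer rule. `top` is the flexible
# display being rubbed; `bottom` is the display or peripheral beneath it.
def rub_transfer(top, bottom, copy=True):
    """Transfer content between a stacked pair when a rub gesture is detected."""
    if top.get("content"):               # top holds content -> push it down
        source, dest = top, bottom
    else:                                # top is empty -> pull content up
        source, dest = bottom, top
    selected = source.get("selection") or source.get("content", [])
    dest.setdefault("content", []).extend(selected)
    if not copy:                         # move instead of copy
        source["content"] = [c for c in source.get("content", []) if c not in selected]
    return source, dest

if __name__ == "__main__":
    top = {"content": ["photo_01", "photo_02"], "selection": ["photo_02"]}
    printer = {"content": []}
    rub_transfer(top, printer)
    print(printer["content"])  # ['photo_02'] -> would be sent to the printer
```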
  • Staple Like a physical staple linking a set of pages, two or more flexible displays may be placed together such that one impacts the second with a detectable force that is over a set threshold (see FIG. 8 ).
  • This gesture may be used to clone the information associated with the moving flexible display onto the stationary destination document, given that the destination flexible display is empty. If the destination display is not empty, the action shall be identical to that of the collate gesture.
  • Point Users can point at the content of a paper window using their fingers or a digital pen (see FIG. 9 ). Fingers and pens are tracked by either computer vision, accelerometers, or some other means. Tapping the flexible display once performs a single click. A double click is issued by tapping the flexible display twice in rapid succession.
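  • A minimal sketch of this tap classification, assuming a 300 ms double-click window (the threshold is an assumption, not a value from this disclosure):

```python
# Hypothetical sketch: classify taps on the flexible display as single or
# double clicks using a timing window between successive contacts.
DOUBLE_CLICK_WINDOW = 0.3  # seconds; assumed value

class TapClassifier:
    def __init__(self):
        self._last_tap = None

    def on_tap(self, timestamp):
        """Return 'double_click' if this tap follows another within the window."""
        if self._last_tap is not None and timestamp - self._last_tap <= DOUBLE_CLICK_WINDOW:
            self._last_tap = None
            return "double_click"
        self._last_tap = timestamp
        return "single_click"

if __name__ == "__main__":
    taps = TapClassifier()
    print(taps.on_tap(0.00))   # single_click
    print(taps.on_tap(0.18))   # double_click
    print(taps.on_tap(1.50))   # single_click
```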
  • Two-handed Pointing allows users to select disjoint items on a single flexible display, or across multiple flexible displays that are collocated (see FIG. 10 ).
  • the active document is selected for editing by clicking on its corresponding window. If only one window is associated with one flexible display, the hold gesture can be used to activate that window, making it the window that receives input operations.
  • the flexible display remains active until another flexible display is picked up and held by the user.
  • Select. Items on a flexible display can be selected through a one-handed or two-handed pointing gesture.
  • a user opens an item on a page for detailed inspection by pointing at it, and tapping it twice.
  • Two-handed pointing allows parallel use of the hands to select disjoint items on a page.
  • sets of icons can be grouped quickly by placing one finger on the first icon in the set and then tapping one or more icons with the index finger of the other hand.
  • flexible displays are placed on a flat surface when performing this gesture.
  • Two-handed pointing can also be used to select items using rubber banding techniques. With this technique, any items within the rubber band, bounded by the location of the two finger tips, are selected upon release.
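  • As a non-limiting sketch, assuming items and fingertips are expressed in normalized display coordinates, the rubber-band selection reduces to a rectangle test between the two fingertip positions at release:

```python
# Hypothetical sketch of two-handed rubber-band selection: items inside the
# rectangle spanned by the two fingertips (in display coordinates) are selected.
def rubber_band_select(items, tip_a, tip_b):
    """items: dict of name -> (x, y); tip_a, tip_b: (x, y) fingertip positions."""
    x0, x1 = sorted((tip_a[0], tip_b[0]))
    y0, y1 = sorted((tip_a[1], tip_b[1]))
    return [name for name, (x, y) in items.items() if x0 <= x <= x1 and y0 <= y <= y1]

if __name__ == "__main__":
    icons = {"a": (0.1, 0.1), "b": (0.4, 0.5), "c": (0.9, 0.9)}
    print(rubber_band_select(icons, (0.05, 0.05), (0.6, 0.6)))  # ['a', 'b']
```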
  • objects on a screen can be selected as those located on a foldline or double foldline ( 2 ) produced by bends (see FIG. 6 ).
  • Copy & Paste In GUIs, copying and pasting of information is typically performed using four discrete steps: ( 1 ) specifying the source, ( 2 ) issuing the copy, ( 3 ) specifying the destination of the paste and ( 4 ) issuing the paste. In flexible displays, these actions can be merged into simple rubbing gestures:
  • Computer windows can be transferred to a flexible display by rubbing a blank flexible display onto the computer screen.
  • the window content is transferred to the flexible display upon peeling the flexible display off the computer screen.
  • the process is reversed when transferring a document displayed on a flexible display back to the computer screen.
  • Copy Between Displays Users can copy content from one flexible display to the next. This is achieved by placing a flexible display on top of a blank display. The content of the source page is transferred by rubbing it onto the blank display. If prior selections exist on the source page, only highlighted items are transferred.
  • Scroll Users can scroll through content of a flexible display in discrete units, or pages. Scrolling action is initiated by half-folding, or folding then flipping, the flexible display around its horizontal or vertical axis with a flip or fold gesture. In a non-limiting example, this causes the next page in the associated content to be displayed on the back side of the flexible display. Users can scroll back by reversing the flip.
  • Flips or folds around the horizontal or vertical axis may also be used to specify back and forward actions that are application specific. For example, when browsing the web, a left flip may cause the previous page to be loaded. To return to the current page, users would issue a right flip.
  • the use of spatially orthogonal flips allows users to scroll and navigate a document independently.
  • the staple gesture can be used to generate parallel copies of a document on multiple flexible displays. Users can open a new view into the same document space by issuing a staple gesture impacting a blank display with a source display. This, for example, allows users to edit disjoint parts of the document simultaneously using two separate flexible displays. Alternatively, users can display multiple pages in a document simultaneously by placing a blank flexible display beside a source flexible display, thus enlarging the view according to the collocate gesture. Rubbing across both displays causes the system to display the next page of the source document onto the blank flexible display that is beside it.
  • Documents projected on a flexible display can be scaled using one of two techniques. Firstly, the content of a display can be zoomed within the document. Secondly, users can transfer the source material to a flexible display with a larger size. This is achieved by rubbing the source display onto a larger display. Upon transfer, the content automatically resizes to fit the larger format.
  • Users can use flexible displays, or other objects, including computer peripherals such as scanners and copiers, as digital stationery. Stationery pages are blank flexible displays that only display a set of application icons. Users can open a new document on the flexible display by tapping an application icon. Users may retrieve content from a scanner or email appliance by rubbing the flexible display onto said scanner or appliance. Users may also put the display or associated computing resources in a state of reduced energy use through a roll or semi-permanent fold gesture, where said condition is reversed upon unrolling or unfolding said display.
  • a document is saved by performing the rubbing gesture on a single flexible display, typically while it is placed on a surface.
  • Content displayed on a flexible display may be closed by transferring its contents to a desktop computer using a rubbing gesture. Content may be erased by crumpling or shaking the flexible display.
  • a real piece of flexible, curved or three-dimensional material such as a cardboard model, piece of paper, textile or human skin may be tracked using computer vision, modeled, texture mapped and then projected back upon the object.
  • the computer vision methods may simply be used to track the shape, orientation and location of a flexible display that does not require the projection component. This in effect implements a projected two-sided flexible display surface that follows the movement, shape and curves of any object in six degrees of freedom.
  • An overview of the elements required for such embodiment of the flexible display ( 1 ) is provided in FIGS. 10 and 11 .
  • the surface is augmented with infrared (IR) reflective marker dots ( 3 ).
  • FIG. 13 shows the elements of the capture and projection system, where the fingers ( 6 ) of the user ( 7 ) are tracked by affixing three or more IR marker dots to the digit.
  • a digital projection unit ( 5 ) allows for projection of the image onto the scene, and a set of infrared or motion capturing cameras ( 4 ) allows tracking of the shape, orientation and location of the sheets of paper.
  • the following section discusses each of the above apparatus elements, illustrating their relationship to other objects in this embodiment of the system. This example does not preclude other possible embodiments of the apparatus, which include accelerometers embedded in lieu of the marker dots and mounted on flexible displays. In such embodiment, the wireless accelerometers report acceleration of the marked positions of the material in three dimensions to a host computer so as to determine their absolute or relative location.
  • the computer vision component uses a Vicon ( 23 ) tracker or equivalent computer vision system that can capture three dimensional motion data of retro-reflective markers mounted on the material.
  • Our setup consists of 12 cameras ( 4 ) that surround the user's work environment, capturing three dimensional movement of all retro-reflective markers ( 3 ) within a workspace of 20′ × 10′ (see FIG. 13 ).
  • the system uses the Vicon data to reconstruct a complete three-dimensional representation that maps the shape, location and orientation of each flexible display surface in the scene.
  • an initial process of modeling the flexible display is required before obtaining the marker data.
  • a Range of Motion (ROM) trial is captured that describes typical movements of the flexible display through the environment. This data is used to reconstruct a three dimensional model that represents the flexible display.
  • Vicon software calibrates the ROM trial to the model and uses it to understand the movements of the flexible display material during a real-time capture, effectively mapping each marker dot on the surface to a corresponding location on the model of the flexible display in memory.
  • This mapping is implemented using sample code that is available as part of Vicon's Real Time Development Kit ( 23 ).
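  • The labelling performed by the tracker can be approximated, as a non-limiting sketch, by greedy nearest-neighbour assignment of newly observed marker positions to the labelled positions from the previous frame (the matching strategy is an assumption; the actual toolkit implementation may differ):

```python
# Hypothetical sketch: keep marker identities stable across frames by
# assigning each newly observed 3D point to the nearest marker position
# from the previous frame (greedy nearest-neighbour labelling).
import numpy as np

def label_markers(prev_labeled, observed):
    """prev_labeled: dict label -> (3,) array; observed: (N, 3) array-like.
    Returns dict label -> (3,) array for the current frame."""
    observed = np.asarray(observed, dtype=float)
    remaining = list(range(len(observed)))
    labeled = {}
    for label, prev_pos in prev_labeled.items():
        dists = [np.linalg.norm(observed[i] - prev_pos) for i in remaining]
        best = remaining.pop(int(np.argmin(dists)))
        labeled[label] = observed[best]
    return labeled

if __name__ == "__main__":
    prev = {"top_left": np.array([0.0, 0.3, 1.0]), "top_right": np.array([0.2, 0.3, 1.0])}
    now = [[0.21, 0.31, 1.02], [0.01, 0.29, 0.99]]
    print({k: v.round(2).tolist() for k, v in label_markers(prev, now).items()})
```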
  • each flexible display surface within the workspace is augmented with IR reflective markers, accelerometers and/or optic fibres that allow shape, deformation, orientation and location of said surface to be computed.
  • the markers are affixed to form an eight point grid (see FIGS. 10 and 11 ).
  • a graphics engine interfaces with the Vicon server, which streams marker data to our modeling component.
  • coordinates or relative coordinates of the markers are computed from the acceleration of said markers, and provided to our modeling component. The modeling component subsequently constructs a three-dimensional model in OpenGL of each flexible display surface that is tracked by the system.
  • the center point of the flexible display surface is determined by averaging the positions of the markers on said surface.
  • Bezier curve analysis of marker locations is used to construct a fluid model of the flexible display surface shape, where Bezier control points correspond with the location of markers on the display surface. Subsequent analysis of the movement of said surface is used to detect the various gestures.
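  • A minimal sketch of this modelling step, assuming one cubic Bezier curve per row of four markers (the exact curve construction beyond the stated control points is an assumption):

```python
# Hypothetical sketch: model one edge of the flexible display as a cubic
# Bezier curve whose four control points are tracked marker positions,
# and compute the display centre as the mean of all marker positions.
import numpy as np

def display_center(markers):
    """markers: (N, 3) array of marker positions -> (3,) centre point."""
    return np.asarray(markers, dtype=float).mean(axis=0)

def bezier_point(control_points, t):
    """Evaluate a cubic Bezier at parameter t in [0, 1]."""
    p0, p1, p2, p3 = (np.asarray(p, dtype=float) for p in control_points)
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

if __name__ == "__main__":
    row = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.03), (0.2, 0.0, 0.03), (0.3, 0.0, 0.0)]  # bent edge
    print(display_center(row).round(3))
    print(bezier_point(row, 0.5).round(3))   # midpoint of the curved edge
```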
  • Application windows that provide content to the flexible displays run on an associated computer.
  • Where the flexible display surface consists of a polymer flexible display capable of displaying data without projection, application windows are simply transferred and displayed on said display.
  • Otherwise, application windows are first rendered off-screen into the OpenGL graphics engine.
  • the graphics engine performs real-time screen captures, and maps a computer image to the three dimensional OpenGL model of the display surface.
  • the digital projector then projects an inverse camera view back onto the flexible display surface.
  • Back projecting the transformed OpenGL model automatically corrects for any skew caused by the shape of the flexible display surface, effectively synchronizing the two.
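  • A minimal sketch of the underlying geometry, assuming a pinhole projector model with made-up intrinsics: each modelled surface point is drawn at the pixel where the projector "sees" it, which is what cancels the skew on the physical surface:

```python
# Hypothetical sketch: project a 3D point on the modelled display surface
# into the projector's image plane (pinhole model). Rendering the textured
# model from this viewpoint is what cancels the skew on the physical sheet.
import numpy as np

def project_to_projector(X, K, R, t):
    """X: (3,) world point; K: (3,3) intrinsics; R, t: projector pose. Returns pixel (u, v)."""
    x_cam = R @ np.asarray(X, dtype=float) + t     # world -> projector coordinates
    u, v, w = K @ x_cam                            # perspective projection
    return u / w, v / w

if __name__ == "__main__":
    K = np.array([[1400.0, 0, 512], [0, 1400.0, 384], [0, 0, 1]])  # assumed intrinsics
    R = np.eye(3)                                                  # projector looking down +z
    t = np.array([0.0, 0.0, 2.0])                                  # 2 m above the desk
    corner = np.array([0.15, 0.10, 0.0])                           # a display corner on the desk
    print(project_to_projector(corner, K, R, t))
```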
  • the graphics engine similarly models fingers and pens in the environment, posting this information to the off-screen window for processing as cursor movements.
  • input from pens, fingers or other input devices can be obtained through other methods known in the art.
  • fingers ( 6 ) of the user ( 7 ) are tracked by augmenting them with 3 IR reflective markers ( 3 ). Sensors are placed evenly from the tip of the finger up to the base knuckle. Pens are tracked similarly throughout the environment. The intersection of a finger or pen with a flexible display surface is calculated using planar geometry. When the pen or finger is sufficiently close, its tip is projected onto the plane of the flexible display surface. The position of the tip is then related to the length and width of the display. The x and y position of the point on the display ( 1 ) is calculated using simple trigonometry. When the pen or finger touches the display, the input device is engaged.
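  • A non-limiting sketch of this computation, assuming the display is treated locally as a plane spanned by known x and y axes and a 1 cm touch threshold (both assumptions):

```python
# Hypothetical sketch: map a tracked fingertip to (x, y) coordinates on a
# flexible display modelled locally as a plane, and decide whether it touches.
import numpy as np

TOUCH_THRESHOLD = 0.01  # metres; assumed value

def finger_on_display(tip, center, x_axis, y_axis):
    """tip, center: (3,) points; x_axis, y_axis: unit vectors spanning the display plane.
    Returns ((x, y) in display coordinates, touching: bool)."""
    tip, center = np.asarray(tip, float), np.asarray(center, float)
    normal = np.cross(x_axis, y_axis)
    offset = tip - center
    dist = abs(float(offset @ normal))              # distance from the display plane
    in_plane = offset - (offset @ normal) * normal  # orthogonal projection onto the plane
    x, y = float(in_plane @ x_axis), float(in_plane @ y_axis)
    return (x, y), dist <= TOUCH_THRESHOLD

if __name__ == "__main__":
    coords, touching = finger_on_display(
        tip=[0.12, 0.05, 0.008], center=[0.0, 0.0, 0.0],
        x_axis=np.array([1.0, 0.0, 0.0]), y_axis=np.array([0.0, 1.0, 0.0]))
    print(coords, touching)   # (0.12, 0.05) True
```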
  • For a projected flexible display, computer images or windows are rendered onto the paper by a digital projector ( 5 ) positioned above the workspace.
  • the projector is placed such that it allows a clear line of sight with the flexible display surface between zero and forty-five degrees of visual angle.
  • Using one projector introduces a set of tradeoffs. For example, positioning the projector close to the scene improves the image quality but reduces the overall usable space, and vice versa.
  • a set of multiple projectors can be used to render onto the flexible display surface as it travels throughout the environment of the user.
  • a calibration procedure is required to pair the physical position of the flexible display surface and the digital output of the projector. This is accomplished by adjusting the position, rotation, and size of the projector output until it matches the dimensions of the physical display surface.
  • the term “marker” is interchangeable with the term “accelerometer”. Understanding the physical motion of paper and other materials in the system requires a combination of approaches. For gestures such as stapling, it is relatively easy to recognize when two flexible displays are rapidly moved towards each other. However, flipping requires knowledge of a flexible display surface's prior state. To recognize this event, the z location of markers at the top and bottom of the page is tracked. During a vertical or horizontal half-rotation, the relative location on the z dimension is exchanged between markers. The movement of the markers is compared to their previous position to determine the direction of the flip, fold or bend.
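  • As a minimal sketch, the flip test reduces to watching whether the height ordering of the top-edge and bottom-edge markers inverts between frames (a simplified reading of the description above; marker z values are assumed):

```python
# Hypothetical sketch: detect a flip around the horizontal axis by watching
# the z (height) ordering of markers on the top and bottom edges of the page.
def detect_flip(prev_top_z, prev_bottom_z, top_z, bottom_z):
    """Return 'flip_forward', 'flip_backward', or None."""
    was_top_higher = prev_top_z > prev_bottom_z
    is_top_higher = top_z > bottom_z
    if was_top_higher == is_top_higher:
        return None                      # no exchange of z order -> no flip
    # Which edge ends up higher after the exchange gives the direction.
    return "flip_forward" if was_top_higher else "flip_backward"

if __name__ == "__main__":
    # Bottom edge lifted up and over the top edge between two frames:
    print(detect_flip(prev_top_z=0.02, prev_bottom_z=0.01, top_z=0.01, bottom_z=0.05))
```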
  • To recognize the rubbing gesture, marker data is recorded over multiple trials and the gesture is then isolated in the data. Once located, the gesture is normalized and is used to calculate a distance vector for each component of the fingertip's movement. The system uses this distance vector to establish a confidence value. If this value passes a predetermined threshold the system recognizes the gesture, and if such gesture occurs near the display surface, a rubbing event is issued to the application.
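  • A simplified stand-in for such a recognizer, assuming template matching of the normalized fingertip trajectory and an assumed confidence threshold (neither is specified in this disclosure):

```python
# Hypothetical sketch of rubbing-gesture recognition: resample and normalize
# the fingertip's x-trajectory, compare it to a stored back-and-forth template,
# and accept the gesture when the confidence exceeds a threshold.
import numpy as np

CONFIDENCE_THRESHOLD = 0.8  # assumed value

def _normalize(traj, n=32):
    traj = np.asarray(traj, dtype=float)
    resampled = np.interp(np.linspace(0, 1, n), np.linspace(0, 1, len(traj)), traj)
    resampled -= resampled.mean()
    scale = np.abs(resampled).max()
    return resampled / scale if scale > 0 else resampled

def rub_confidence(trajectory, template):
    """Confidence in [0, 1] that `trajectory` matches the rubbing `template`."""
    a, b = _normalize(trajectory), _normalize(template)
    distance = np.linalg.norm(a - b) / np.sqrt(len(a))
    return max(0.0, 1.0 - distance)

if __name__ == "__main__":
    template = np.sin(np.linspace(0, 4 * np.pi, 64))          # two back-and-forth strokes
    observed = np.sin(np.linspace(0, 4 * np.pi, 50)) + 0.05 * np.random.randn(50)
    conf = rub_confidence(observed, template)
    print(conf, conf > CONFIDENCE_THRESHOLD)
```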
  • One such non-limiting example is the selection of photos for printout from a digital photo database containing raw footage.
  • Our design was inspired by the use of contact sheets by professional photographers. Users can compose a photo collage using two flexible displays, selecting a photo on one overview display and then rubbing it onto the second display with a rubbing gesture.
  • This scenario shows the use of flexible display input as a focus and context technique, with one display providing a thumbnail overview of the database, and the other display offering a more detailed view.
  • Users can select thumbnails by pointing at the source page, or by selecting rows through producing a foldline with a bend gesture. By crossing two fold lines, a single photo or object may be selected. Thumbnails that appear rotated can be turned using a simple pivoting action of the index finger. After selection, thumbnails are transferred to the destination page through a rubbing gesture. After the copy, thumbnails may resize to fit the destination page.
  • the content of the destination flexible display can be printed by performing a rubbing gesture onto a printer. The printer location is tracked similarly to that of the flexible display, and is known to the system.
  • Gestures supported by the invention can also be used to edit photos prior to selection. For example, photos are cropped by selecting part of the image with a two-handed gesture, and then rubbing the selection onto a destination flexible display. Photos can be enlarged by rubbing them onto a larger flexible display.
  • the invention is used to implement a computer game that displays its graphic animations onto physical game board pieces.
  • Said pieces may consist of cardboard that is tracked and projected upon using the apparatus described in this invention, or electronic paper, LCD, e-ink, OLED or other forms of thin, or thin-film displays.
  • the well-known board game Settlers of Catan consists of a game board design in which hexagonal pieces with printed functionality can be placed differently in each game, allowing for a game board that is different each game.
  • Each hexagonal piece, or hex represents a raw material or good that can be used to build roads or settlements, which is the purpose of the game.
  • each hex is replaced by a flexible display of the same shape, the position and orientation of which is tracked, such that the hexes placed together form the board.
  • a computer algorithm then renders the functionality onto each flexible display hex. This is done through a computer algorithm that calculates and randomizes the board design each time, but within and according to the rules of the game.
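  • As a non-limiting sketch, assuming the standard base-game distribution of nineteen hexes (an assumption, not stated in this disclosure), the randomization step can be expressed as a shuffle over the tracked hex displays:

```python
# Hypothetical sketch: randomize a Catan-style board layout over the tracked
# hexagonal flexible displays, then tell each display what to render.
import random

# Assumed standard base-game distribution of 19 hexes.
RESOURCES = (["wood"] * 4 + ["sheep"] * 4 + ["wheat"] * 4
             + ["brick"] * 3 + ["ore"] * 3 + ["desert"])

def randomize_board(hex_display_ids, seed=None):
    """Map each tracked hex display id to the resource it should render."""
    assert len(hex_display_ids) == len(RESOURCES)
    rng = random.Random(seed)
    layout = RESOURCES[:]
    rng.shuffle(layout)
    return dict(zip(hex_display_ids, layout))

if __name__ == "__main__":
    hexes = [f"hex_{i:02d}" for i in range(19)]
    board = randomize_board(hexes, seed=42)
    print(board["hex_00"], board["hex_01"])
```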
  • the graphics on the hexes are animated with computer graphics that track and represent the state of the game. All physical objects in the game are tracked by the apparatus of our invention and can potentially be used as display surfaces. For example, when a user rolls a die, the outcome of said roll is known to the game. Alternatively, the system may roll the die for the user, representing the outcome on a cube-shaped flexible display that represents the cast die.
  • the number provided by said die indicates the hex that is to produce goods for the users.
  • a lumberjack may be animated to walk onto the hex to cut a tree, thus providing the wood resource to a user.
  • city and road objects may be animated with wagons and humans after they are placed onto the hex board elements.
  • Hex elements that represent ports or seas may be animated with ships that move goods from port to port. Animations may trigger behavior in the game, making the game more challenging. For example, a city or port may explode, requiring the user to take action, such as rebuild the city or port.
  • a resource may be depleted, which is represented by a woodland hex slowly turning into a meadow hex, and a meadow hex slowly turning into a desert hex that is unproductive.
  • climate may be simulated, allowing users to play the game under different seasonal circumstances, thus affecting their constraints. For example, during winters, ports may not be in use.
  • This invention allows the functionality of pc-based or online computer games known in the art, such as Simcity, The Sims, World of Warcraft, or Everquest to be merged with that of physical board game elements.
  • the invention is used to provide display on any three dimensional object, such that it allows animation or graphics rendering on said three dimensional object.
  • the invention may be used to implement a rapid prototyping environment for the design of electronic appliance user interfaces, such as, for example, but not limited to, the Apple iPod.
  • One element of such embodiment is a three dimensional model of the appliance, made out of cardboard, Styrofoam, or the like, and either tracked and projected upon using the apparatus of this invention or coated with electronic paper, LCD, e-ink, OLED or other forms of thin, or thin-film displays, such that the shapes and curvatures of the appliance are followed.
  • Another element of such embodiment is another flexible display apparatus described in this invention.
  • the flexible display surface acts as a palette on which user interface elements such as displays and dials are displayed. These user interface elements can be selected and picked up by the user by tapping their corresponding location on the palette display. Subsequent tapping on the appliance model places the selected user interface element onto the appliance's flexible display surface.
  • User interface elements may be connected or associated with each other using a pen or finger gesture on the surface of the model. For example, a dial user interface element may be connected to a movie user interface element on the model, such that said dial, when activated, causes a scroll through said movie.
  • each element of the architectural model consists of a flexible display surface.
  • one flexible display surface may be shaped as a wall element, while another flexible display surface may be shaped as a roof element; these are physically placed together to form the larger architectural model.
  • Another flexible display surface acts as a palette on which the user can select colors and materials.
  • the flexible display architectural model can be animated such that living or physical conditions such as seasons or wear and tear can be simulated.
  • the flexible display model represents a product packaging.
  • the palette containing various graphical elements that can be placed on the product packaging, for example, to determine the positioning of typographical elements on the product.
  • product packaging may itself contain or consist of one or multiple flexible display surfaces, such that the product packaging can be animated or used to reflect some computer functionality, including but not limited to online content, messages, RSS feeds, animations, TV shows, newscasts, games and the like.
  • users may tap the surface of a soft drink or food container with an embedded flexible display surface to play a commercial advertisement or TV show on said container, or to check electronic messages. Users may rotate the container to scroll through content on its display, or use a rub gesture to scroll through content.
  • the product packaging is itself used as a pointing device, that allows users to control a remote computer system.
  • the flexible display surface consists of electronic textile displays such as but not limited to OLED textile displays known in the art, or white textiles that are tracked and projected upon using the apparatus of this invention.
  • These textile displays may be worn by a human, and may contain interactive elements such as buttons, as per Example 3.
  • the textile is worn by a human and the display is used by a fashion designer to rapidly prototype the look of various textures, colors or patterns of fabric on the design, in order to select said print for a dress or garment made out of real fabric.
  • said textures on said flexible textile displays are permanently worn by the user and constitute the garment.
  • said flexible display garment may display messages that are sent to said garment through electronic means by other users, or that represent advertisements and the like.
  • the flexible textile display is worn by a patient in a hospital, and displays charts and images showing vital statistics, including but not limited to x-ray, ct-scan, or MRI images of said patient. Doctors may interact with user interface elements displayed on said flexible textile display through any of the interaction techniques of this invention and any technique known in the prior art. This includes tapping on buttons or menus displayed on said display to select different vital statistics of said patient.
  • the flexible textile display is draped on a patient in surgery to show models or images including but not limited to x-ray, ct-scan, MRI or video images of elements inside the patient's body to aid surgeons in, for example, pinhole surgery and minimally invasive operations. Images of various regions in the patient's body may be selected by moving the display to that region.
  • images of vital statistics, x-rays, ct-scans, MRIs, video images and the like may be projected directly onto a patient to aid or otherwise guide surgery.
  • the human skin itself functions as a display through projection onto said skin, and through tracking the movement and shape of said skin by the apparatus of invention.
  • Such images may contain user interface elements that can be interacted with by a user through techniques of this invention, and those known in the art. For example, tapping a body element may bring up a picture of the most recent x-ray of that element for display, or may be used as a form of input to a computer system.
  • the flexible surface with markers is used as input to a computer system that displays on a standard display that is not said flexible surface, allowing use of said flexible surface and the gestures in this invention as an input device to a computing system.

Abstract

The invention relates to a set of interaction techniques for obtaining input to a computer system based on methods and apparatus for detecting properties of the shape, location and orientation of flexible display surfaces, as determined through manual or gestural interactions of a user with said display surfaces. Such input may be used to alter graphical content and functionality displayed on said surfaces or some other display or computing system.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority from U.S. Provisional Application Ser. No. 60/788,405, filed on Mar. 30, 2006.
  • Each of the applications and patents cited in this text, as well as each document or reference cited in each of the applications and patents (including during the prosecution of each issued patent; “application cited documents”), and each of the U.S. and foreign applications or patents corresponding to and/or claiming priority from any of these applications and patents, and each of the documents cited or referenced in each of the application cited documents, are hereby expressly incorporated herein by reference. More generally, documents or references are cited in this text, either in a Reference List before the claims, or in the text itself; and, each of these documents or references (“herein-cited references”), as well as each document or reference cited in each of the herein-cited references (including any manufacturer's specifications, instructions, etc.), is hereby expressly incorporated herein by reference. Documents incorporated by reference into this text may be employed in the practice of the invention.
  • FIELD OF THE INVENTION
  • The present invention relates generally to input and interaction techniques associated with flexible display devices.
  • BACKGROUND OF THE INVENTION
  • In recent years, considerable progress has been made towards the development of thin and flexible displays. U.S. Pat. No. 6,639,578 cites a process for creating an electronically addressable display that includes multiple printing operations, similar to a multi-color process in conventional screen printing. Likewise, U.S. Pat. Application No. 2006/0007368 cites a display device assembly comprising a flexible display device being rollable around an axis. A range of flexible electronic devices based on these technologies, including full color, high-resolution flexible OLED displays with a thickness of 0.2 mm, are being introduced to the market (14). The goal of such efforts is to develop displays that resemble the superior handling, contrast and flexibility of real paper.
  • As part of this invention we devised an apparatus for tracking interaction techniques for flexible displays that uses a projection apparatus that projects images generated by a computer onto real paper, of which the shape is subsequently measured using a computer vision device. Deformation of the shape of the paper display is then used to manipulate in real time said images and/or associated computer functions displayed on said display. It should be noted that the category of displays to which this invention pertains is very different from the type of rigid-surface LCD displays cited in, for example, U.S. Pat. Nos. 6,567,068 or 6,573,883 which can be rotated around their respective axes but not deformed.
  • Prior art, which includes bendable interfaces such as ShapeTape (1) and Gummi (20), demonstrates the value of incorporating the deformation of computing objects for use as input for computer processes. However, in this patent, we propose methods for interacting with flexible displays that rely on deformations of the surface structure of the display itself. While this extends work performed by Schwesig et al. (17), which proposed a credit card sized computer that uses physical deformation of the device for browsing of visual information, it should be noted that said device did not incorporate a flexible material, and did not use deformation of the display. Instead, it relied on the use of touch sensors mounted on a rigid LCD-style display body.
  • The use of projection to simulate computer devices on three dimensional objects is also cited in prior art. SmartSkin (18) is an interactive surface that is sensitive to human finger gestures. With SmartSkin, the user can manipulate the contents of a digital back-projection desk using manual interaction. Similarly, Rekimoto's Pick and Drop (16) is a system that lets users drag and drop digital data among different computers by projection onto a physical object. In Ishii's Tangible User Interface (TUI) paradigm (5), interaction with projected digital information is provided through physical manipulation of real-world objects. In all such systems, the input device is not the actual display itself, or the display is not on the actual input device. With DataTiles (17), Rekimoto et al. proposed the use of plastic surfaces as widgets with touch-sensitive control properties for manipulating data projected onto other plastic surfaces. Here, the display surfaces are again two-dimensional and rigid body.
  • In DigitalDesk (24), a physical desk is augmented with electronic input and display. A computer controlled camera and projector are positioned above the desk. Image processing is used to determine which page a user is pointing at. Optical character recognition transfers content between real paper and electronic documents projected on the desk. Wellner demonstrates the use of his system with a calculator that blurs the boundaries between the digital and physical world by taking a printed number and transferring it into an electronic calculator. Interactive Paper (11) provides a framework for three prototypes. Ariel (11) merges the use of engineering drawings with electronic information by projecting digital drawings on real paper laid out on a planar surface. In Video Mosaic (11), a paper storyboard is used to edit video segments. Users annotate and organize video clips by spreading augmented paper over a large tabletop. Cameleon (11) simulates the use of paper flight strips by air traffic controllers, merging them with the digital world. Users interact with a tablet and touch sensitive screen to annotate and obtain data from the flight strips. Paper Augmented Digital Documents (3) are digital documents that are modified on a computer screen or on paper. Digital copies of a document are maintained in a central database and if needed, printed to paper using IR transparent ink. This is used to track annotations to documents using a special pen.
  • Insight Lab (9) is an immersive environment that seamlessly supports collaboration and creation of design requirement documents. Paper documents and whiteboards allow group members to sketch, annotate, and share work. The system uses bar code scanners to maintain the link between paper, whiteboard printouts, and digital information.
  • Xlibris (19) uses a tablet display and paper-like interface to include the affordances of paper while reading. Users can read a scanned image of a page and annotate it with digital ink. Annotations are captured and used to organize information. Scrolling has been removed from the system: pages are turned using a pressure sensor on the tablet. Users can also examine a thumbnail overview to select pages. Pages can be navigated by locating similar annotations across multiple documents. Fishkin et al. (2) describe embodied user interfaces that allow users to use physical gestures like page turning, card flipping, and pen annotation for interacting with documents. The system uses physical sensors to recognize these gestures. Due to space limitations we limit our review: other systems exist that link the digital and physical world through paper. Examples include Freestyle (10), Designers' Outpost (8), Collaborage (12), and Xax (6). One feature common to prior work in this area is the restriction of the use of physical paper to a flat surface. Many project onto or sense interaction in a coordinate system based on a rigid 2D surface only. In our system, by contrast, we use as many of the three dimensional affordances of flexible displays as possible.
  • In Illuminating Clay (15), Piper et al. proposed the use of a laser scanner to determine the deformation of a clay mass. This deformation was in turn used to alter images projected upon the clay mass through a projection apparatus. The techniques presented in this patent are different in a number of ways. Firstly, our display unit is completely flexible, can be duplicated to work in unison with other displays of the same type, and can move freely in three-dimensional space. They can be folded 180 degrees around any axis or sub-axes, and as such completely implement the functionality of two-sided flexible displays. Secondly, rather than determining the overall shape of the object as a point cloud, our input techniques rely on determining the 3D location of specific marker points on the display. We subsequently determine the shape of the display by approximating a Bezier curve with control points that coincide with these marker locations, providing superior resolution. Thirdly, unlike Piper (15), we propose specific interaction techniques based on the 3D manipulation and folding of the display unit. The advantages of regular paper over the windowed display units used in standard desktop computing are manifold (21). In The Myth of the Paperless Office (21), Sellen and Harper analyze the use of physical paper. They propose a set of design principles for incorporating affordances of paper documents in the design of digital devices, such as 1) Support for Flexible Navigation, 2) Cross Document Use, 3) Annotation While Reading and 4) Interweaving of Reading and Writing.
  • Documents presented on paper can be moved in and out of work contexts with much greater ease than with current displays. Unlike GUI windows or rigid LCD displays, paper can be folded, rotated and stacked along many degrees of freedom (7). It can be annotated, navigated and shared using extremely simple gestural interaction techniques. Paper allows for greater flexibility in the way information is represented and stored, with a richer set of input techniques than is currently possible with desktop displays. Conversely, display systems currently support properties unavailable in physical paper, such as easy distribution, archiving, querying and updating of documents. By merging the digital world of computing with the physical world of flexible displays we increase the value of both technologies.
  • SUMMARY OF THE INVENTION
  • The present invention relates to a set of interaction techniques for obtaining input to a computer system based on methods and apparatus for detecting properties of the shape, location and orientation of flexible display surfaces, as determined through manual or gestural interactions of a user with said display surfaces. Such input may be used to alter graphical content and functionality displayed on said surfaces or some other display or computing system.
  • One aspect of the invention is a set of interaction techniques for manipulating graphical content and functionality displayed on flexible displays based on methods for detecting the shape, location and orientation of said displays in 3 dimensions and along 6 degrees of freedom, as determined through manual or gestural interaction by a user with said display.
  • Another aspect of the invention is a capture and projection system, used to simulate or otherwise implement a flexible display. Projecting computer graphics onto physical flexible materials allows for a seamless integration between images and multiple 3D surfaces of any shape or form; the system measures and corrects for 3D skew in real time.
  • Another aspect of the invention is the measurement of the deformation, orientation and/or location of flexible display surfaces, for the purpose of using said shape as input to the computer system associated with said display. In one embodiment of the invention, a Vicon Motion Capturing System (23) or equivalent computer vision system is used to measure the location in three dimensional space of retro-reflective markers affixed to or embedded within the surface of the flexible display unit. In another embodiment, movement is tracked through wireless accelerometers embedded into the flexible display surface in lieu of said retro-reflective markers, or deformations are tracked through optical fibers embedded in the display surface.
  • One embodiment of the invention is the application of said interaction techniques to flexible displays that resemble paper. In another embodiment, the interaction techniques are applied to any form of polymer or organic light emitting diode-based electronic flexible display technology.
  • Another embodiment of the invention is the application of said interaction techniques to flexible displays that mimic or otherwise behave as materials other than paper, including but not limited to textiles whether or not worn on the human body, three-dimensional objects, liquids and the likes.
  • In another embodiment, interaction techniques apply to projection on the skin of live or dead human bodies, the shape of which is sensed via computer vision or embedded accelerometer devices.
  • Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice of the present invention, suitable methods and materials are described below. All publications, patent applications, patents, and other references mentioned herein are expressly incorporated by reference in their entirety. In cases of conflict, the present specification, including definitions, will control.
  • In addition, materials, methods, and examples described herein are illustrative only and are not intended to be limiting.
  • Other features and advantages of the invention will be apparent from and are encompassed by the following detailed description and claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The following Detailed Description, given by way of example, but not intended to limit the invention to specific embodiments described, may be understood in conjunction with the accompanying Figures, incorporated herein by reference, in which:
  • FIG. 1 shows a Hold Gesture with flexible display surface (1). Note that flexible display surfaces and fingers in FIGS. 1 through 10 may include some (hidden) marker(s) (3) according to FIG. 11 or FIG. 12 that have not been included in the drawings for reasons of clarity.
  • FIG. 2 shows a Collocate Gesture with flexible display surfaces (1).
  • FIG. 3 shows a Collate Gesture with flexible display surfaces (1).
  • FIG. 4 shows a Flip Gesture, Fold and Half-fold Gestures with flexible display surface (1).
  • FIG. 5 shows a Roll Gesture with flexible display surface (1) with markers (3).
  • FIG. 6 shows a Bend Gesture with flexible display surface (1) and foldline (2).
  • FIG. 7 shows a Rub Gesture with flexible display surface (1).
  • FIG. 8 shows a Staple Gesture with flexible display surface (1).
  • FIG. 9 shows a Pointing Gesture with flexible display surface (1).
  • FIG. 10 shows a Multi-handed Pointing Gesture with flexible display surface (1).
  • FIG. 11 shows a Flexible display surface (1) with markers (3).
  • FIG. 12 shows another embodiment of flexible display surface (1) made of fabric or similar materials with markers (3).
  • FIG. 13 shows a System apparatus for tracking flexible display surface (1) through computer vision cameras emitting infrared light (4) mounted above a workspace with user (7), where markers (3) affixed to flexible display surface (1) reflect infrared light emitted by computer vision cameras (4). Optionally, digital projection system (5) projects images of the modeled flexible display surfaces rendered with textures back onto said flexible display surfaces.
  • DETAILED DESCRIPTION OF THE INVENTION Definitions
  • “Flexible Display” or “Flexible Display Surface” means any display surface made of any material, including, but not limited to displays constituted by projection and including, but not limited to real and electronic paper known in the art, based on Organic Light Emitting Devices or other forms of thin, thin-film or e-ink based technologies such as, e.g., described in U.S. Pat. No. 6,639,578, cardboard, Liquid Crystal Diode(s), Light Emitting Diode(s), Stacked Organic, Transparent Organic or Polymer Light Emitting Device(s) or Diode(s), Optical Fibre(s), Styrofoam, Plastic(s), Epoxy Resin, Textiles, E-textiles, or clothing, skin or body elements of a human or other organism, living or dead, Carbon-based materials, or any other three-dimensional object or model, including but not limited to architectural models, and product packaging. Within the scope of this application, the term is can be interpreted interchangeably as paper, document or paper window, but will not be limited to such interpretation.
  • The term “Paper Window” refers to one embodiment of a flexible display surface implemented by tracking the shape, orientation and location of a sheet of paper, projecting back and image onto said sheet of paper using a projection system, such that it constitutes a flexible electronic display. Within the scope of this application, the term is may be interpreted as interchangeable with flexible display, flexible display surface or document, but the terms flexible display, document and flexible display surface shall not be limited to such interpretation.
  • The term “document” is synonymous for Flexible Display or Flexible Display Surface.
  • “Marker” refers to a device that is affixed to a specific location on a flexible display surface for the purpose of tracking the position or orientation of said location on said surface. Said marker may consist of a small half-sphere made of material that reflects light in the infrared spectrum for the purpose of tracking location with an infrared computer vision camera. Said marker may also consist of an accelerometer that reports to a computer system for the purpose of computing the location of said marker, or any other type of location tracking system known in the art. A similar term used in this context is “point.”
  • “Fold” is synonymous with “Bend,” wherein folding is interpreted to typically be limited to a horizontal or vertical axis of the surface, whereas Bends can occur along any axis (2). Folding does not necessary lead to a crease.
  • Interaction Styles
  • Position and shape of flexible displays can be adjusted for various tasks: these displays can be spread about the desk, organized in stacks, or held close for a detailed view. Direct manipulation takes place with the paper display itself: by selecting and pointing using the fingers, or with a digital pen. The grammar of the interaction styles provided by this invention follows that of natural manipulation of paper and other flexible materials that hold information.
  • FIGS. 1 through 10 show a set of gestures based on deformations and location of the flexible display(s). These gestures provide the basic units of interaction with the system:
  • Hold. Users can hold a flexible display with one or two hands during use. The currently held display is the active document (FIG. 1).
  • Collocate. FIG. 2 shows the use of spatial arrangement of the flexible display(s) for organizing or rearranging information on said display(s). In one embodiment, collocating multiple flexible displays allows image contents to be automatically spread or enlarged across multiple flexible displays that are collocated.
  • Collate. FIG. 3 shows how users may stack flexible displays, organizing said displays in piles on a desk. Such physical organization is reflected in the digital world by semantically associating or otherwise relating computer content of the displays, be it files, web-based or other information, located in a database, on a server, file system or the like, for example, by sorting such computer content according to some property of the physical organization of the displays.
  • Flip or Turn. FIG. 4 shows how users may flip or turn the flexible display by folding it over its x or y axis, thus revealing the other side of the display. Flipping or turning the flexible display around an axis may reveal information that is stored contiguously to the information displayed on the edge of the screen. Note that this flipping or turning gesture is distinct from that of rotating a rigid display surface, in that the folds that occur in the display in the process of turning or flipping the display around its axes are used in detecting said turn or flip. In single page documents, a flip gesture around the x axis may, in a non-limiting example, scroll the associated page content in the direction opposite to that of the gesture. In this case, the flexible display is flipped around the x axis, such that the bottom of the display is lifted up, then folded over to the top. Here, the associated graphical content scrolls down, thus revealing content below what is currently displayed on the display. The opposite gesture, lifting the top of the display, then folding it over to the bottom of the display, causes content to scroll up, revealing information above what is currently displayed. In the embodiment of multi-page documents, flipping gestures around the x-axis may be used by the application to navigate to the prior or next page of said document, depending on the directionality of the gesture. In the embodiment of a web browser, said gesture may be used to navigate to the previous or next page of the browsing history, depending on the directionality of the gesture.
  • In another embodiment, the flexible display is flipped around the y axis, such that the right hand side of the display is folded up, then over to the left. This may cause content to scroll to the right, revealing information to the right of what is currently on display. The opposite gesture, folding the left side of the display up then over to the right, may cause content to scroll to the left, revealing information to the left of what is currently on display. In the embodiment of multi-page documents, flipping gestures around the y-axis may be used by the application to navigate to the prior or next page of said document, depending on the directionality of the gesture. In the embodiment of a web browser, said gesture may be used to navigate to the previous or next page of the browsing history, depending on the directionality of the gesture.
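  • The mapping from flip gestures to navigation commands described in the two preceding paragraphs can be illustrated with a minimal Python sketch. This sketch is not part of the patented apparatus; the names (FlipAxis, ContentType, dispatch_flip), the notion of a "forward" flip, and the returned action strings are assumptions introduced here purely for illustration.

```python
from enum import Enum, auto

class FlipAxis(Enum):
    X = auto()  # bottom-over-top or top-over-bottom flips
    Y = auto()  # right-over-left or left-over-right flips

class ContentType(Enum):
    SINGLE_PAGE = auto()
    MULTI_PAGE = auto()
    WEB_BROWSER = auto()

def dispatch_flip(axis: FlipAxis, forward: bool, content: ContentType) -> str:
    """Map a detected flip gesture onto a navigation action.

    'forward' is assumed True when the bottom edge (x axis) or the right
    edge (y axis) is folded over first, mirroring the directional examples
    given in the specification text above.
    """
    if content is ContentType.SINGLE_PAGE:
        if axis is FlipAxis.X:
            return "scroll_down" if forward else "scroll_up"
        return "scroll_right" if forward else "scroll_left"
    if content is ContentType.MULTI_PAGE:
        return "next_page" if forward else "previous_page"
    # Web browser content: flips walk the browsing history.
    return "history_forward" if forward else "history_back"

if __name__ == "__main__":
    print(dispatch_flip(FlipAxis.X, forward=True, content=ContentType.SINGLE_PAGE))
    print(dispatch_flip(FlipAxis.Y, forward=False, content=ContentType.WEB_BROWSER))
```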
  • Fold. Note that wherever the term “Fold” is used it can be substituted for the term “Bend” and vice versa, wherein folding is interpreted to typically be limited to a horizontal or vertical axis of the surface. Where folding a flexible display around either or both its horizontal or vertical axis, either in sequence or simultaneously, serves as a means of input to the software that alters the image content of the document, or affects associated computing functionality (see FIG. 4). As a non-limiting example, this may cause objects displayed in the document to be moved to the center of gravity of the fold, or sorted according to a property displayed in the center of gravity of the fold. As another non-limiting example, following the gravity path of the fold that would exist if water were run through that fold, it may cause objects to be moved from one flexible display to a second flexible display placed underneath it.
  • Half fold. Where partly folding a flexible display on one side or corner of the Document causes a scroll, or the next or previous page in the associated file content to be displayed (FIG. 4).
  • Semi-permanent fold. Where the act of folding a flexible display around either its horizontal or vertical axis, or both, in such way that it remains in a semi-permanent folded state after release, serves as input to a computing system. In a non-limiting example, folding causes any contents associated with flexible displays to be digitally archived. In another non-limiting example, the unfolding of the flexible display causes any contents associated with said flexible display to be un-archived and displayed on said flexible display. In another non-limiting example, said flexible display would reduce its power consumption upon a semi-permanent fold, increasing power consumption upon unfold (FIG. 4).
  • Roll. Where the act of changing the shape of a flexible display such that said shape transitions from planar to cylindrical or vice versa serves as input to a computing system. In a non-limiting example, this causes any contents associated with the flexible display to be digitally archived upon a transition from planar to cylindrical shape (rolling up), and to be un-archived and displayed onto said flexible display upon a transition from cylindrical to planar shape (unrolling). In another non-limiting example, rolling up a display causes it to turn off, while unrolling a display causes it to turn on, or display content (FIG. 5).
  • Bend. Where bending a flexible display around any axis serves as input to a computing system. A bend may produce some visible or invisible fold line (2) that may be used to select information on said display, for example, to determine a column of data properties in a spreadsheet that should be used for sorting. In another non-limiting example, a bending action causes graphical information to be transformed such that it follows the curvature of the flexible display, either in two or three dimensions. The release of a bending action causes the graphical contents associated with the flexible display to return to their original shape. Alternatively, deformations obtained through bending may become permanent upon release of the bending action. (See FIG. 6).
  • Rub. The rubbing gesture allows users to transfer content between two or more flexible displays, or between a flexible display and a computing peripheral (see FIG. 7). The rubbing gesture is detected by measuring back and forth motion of the hand on the display, typically horizontally. This gesture is typically interpreted such that information from the top display is transferred, that is, either copied or moved, to the display(s) or peripheral(s) directly beneath it. However, if the top display is not associated with any content (i.e., is empty) it becomes the destination and the object directly beneath the display becomes the source of the information transfer. In a non-limiting example, if a flexible display is placed on top of a printer peripheral, the rubbing gesture would cause its content to be printed on said printer. In another non-limiting example, when an empty flexible display is rubbed on top of a computer screen, the active window on that screen will be transferred to the flexible display such that it displays on said display. When the flexible display contains content, said content is transferred back to the computer screen instead. In a final non-limiting example, when one flexible display is placed on top of another flexible display, the rubbing gesture, applied to the top display, causes information to be copied from the top to the bottom display if the top display holds content, and from the bottom to the top display if the top display is empty. In all examples pertaining to the rubbing gesture, information transfer may be limited to those graphical objects that are currently selected on the source display.
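  • The source/destination rule of the rubbing gesture (content normally flows from the top surface downward, unless the top surface is empty, in which case it becomes the destination, and only selected items move when a selection exists) can be sketched as follows. This is an illustrative Python fragment under those stated assumptions, not the implementation of the apparatus; the Surface class and function names are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Surface:
    """A flexible display or peripheral taking part in a rub transfer."""
    name: str
    items: list = field(default_factory=list)      # content held by the surface
    selection: list = field(default_factory=list)  # currently selected items, if any

def resolve_rub_transfer(top: Surface, beneath: Surface) -> tuple[Surface, Surface]:
    """Content flows from the top surface downward, unless the top surface is
    empty, in which case the top surface becomes the destination."""
    if top.items:
        return top, beneath
    return beneath, top

def rub(top: Surface, beneath: Surface) -> None:
    source, destination = resolve_rub_transfer(top, beneath)
    # If a selection exists on the source, only the selected items are moved.
    payload = source.selection or source.items
    destination.items.extend(payload)

if __name__ == "__main__":
    photo_sheet = Surface("overview", items=["img1", "img2"], selection=["img2"])
    blank_sheet = Surface("blank")
    rub(top=photo_sheet, beneath=blank_sheet)
    print(blank_sheet.items)  # ['img2']
```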
  • Staple. Like a physical staple linking a set of pages, two or more flexible displays may be placed together such that one impacts the second with a detectable force that is over a set threshold (see FIG. 8). This gesture may be used to clone the information associated with the moving flexible display onto the stationary destination document, given that the destination flexible display is empty. If the destination display is not empty, the action shall be identical to that of the collate gesture.
  • Point. Users can point at the content of a paper window using their fingers or a digital pen (see FIG. 9). Fingers and pens are tracked by either computer vision, accelerometers, or some other means. Tapping the flexible display once performs a single click. A double click is issued by tapping the flexible display twice in rapid succession.
  • Two-handed Pointing: Two-handed pointing allows users to select disjoint items on a single flexible display, or across multiple flexible displays that are collocated (see FIG. 10).
  • Interaction Techniques
  • We designed a number of techniques for accomplishing basic tasks using our gesture set, according to the following non-limiting examples:
  • Activate. In GUIs, the active document is selected for editing by clicking on its corresponding window. If only one window is associated with one flexible display, the hold gesture can be used to activate that window, making it the window that receives input operations. The flexible display remains active until another flexible display is picked up and held by the user. Although this technique seems quite natural, it may be problematic when using an input device such as the keyboard. For example, a user may be reading from one flexible display while typing in another flexible display. To address this concern, users can bind their keyboard to the active window using a key.
  • Select. Items on a flexible display can be selected through a one-handed or two-handed pointing gesture. A user opens an item on a page for detailed inspection by pointing at it, and tapping it twice. Two-handed pointing allows parallel use of the hands to select disjoint items on a page. For example, sets of icons can be grouped quickly by placing one finger on the first icon in the set and then tapping one or more icons with the index finger of the other hand. Typically, flexible displays are placed on a flat surface when performing this gesture. Two-handed pointing can also be used to select items using rubber banding techniques. With this technique, any items within the rubber band, bounded by the location of the two finger tips, are selected upon release. Alternatively, objects on a screen can be selected as those located on a foldline or double foldline (2) produced by bends (see FIG. 6).
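  • As an illustration of the rubber-banding selection bounded by two fingertip locations, the following minimal Python sketch selects the items whose display coordinates fall inside the rectangle spanned by the two tracked fingers. The function name, the normalized coordinate convention and the example item positions are assumptions introduced here for clarity; they are not taken from the specification.

```python
def rubber_band_select(finger_a, finger_b, item_positions):
    """Return the items whose (x, y) position on the display falls inside the
    axis-aligned rectangle spanned by the two tracked fingertips."""
    (ax, ay), (bx, by) = finger_a, finger_b
    x_lo, x_hi = sorted((ax, bx))
    y_lo, y_hi = sorted((ay, by))
    return [name for name, (x, y) in item_positions.items()
            if x_lo <= x <= x_hi and y_lo <= y <= y_hi]

if __name__ == "__main__":
    # Hypothetical icon positions in normalized display coordinates.
    icons = {"a": (0.2, 0.3), "b": (0.6, 0.4), "c": (0.9, 0.9)}
    print(rubber_band_select((0.1, 0.1), (0.7, 0.5), icons))  # ['a', 'b']
```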
  • Copy & Paste. In GUIs, copying and pasting of information is typically performed using four discrete steps: (1) specifying the source, (2) issuing the copy, (3) specifying the destination of the paste and (4) issuing the paste. In flexible displays, these actions can be merged into simple rubbing gestures:
  • Transfer to flexible display. Computer windows can be transferred to a flexible display by rubbing a blank flexible display onto the computer screen. The window content is transferred to the flexible display upon peeling the flexible display off the computer screen. The process is reversed when transferring a document displayed on a flexible display back to the computer screen.
  • Copy Between Displays. Users can copy content from one flexible display to the next. This is achieved by placing a flexible display on top of a blank display. The content of the source page is transferred by rubbing it onto the blank display. If prior selections exist on the source page, only highlighted items are transferred.
  • Scroll. Users can scroll through the content of a flexible display in discrete units, or pages. The scrolling action is initiated by half-folding, or by folding then flipping the flexible display around its horizontal or vertical axis with a flip or fold gesture. In a non-limiting example, this causes the next page in the associated content to be displayed on the back side of the flexible display. Users can scroll back by reversing the flip.
  • Browse. Flips or folds around the horizontal or vertical axis may also be used to specify back and forward actions that are application specific. For example, when browsing the web, a left flip may cause the previous page to be loaded. To return to the current page, users would issue a right flip. The use of spatially orthogonal flips allows users to scroll and navigate a document independently.
  • Views. The staple gesture can be used to generate parallel copies of a document on multiple flexible displays. Users can open a new view into the same document space by issuing a staple gesture impacting a blank display with a source display. This, for example, allows users to edit disjoint parts of the document simultaneously using two separate flexible displays. Alternatively, users can display multiple pages in a document simultaneously by placing a blank flexible display beside a source flexible display, thus enlarging the view according to the collocate gesture. Rubbing across both displays causes the system to display the next page of the source document onto the blank flexible display that is beside it.
  • Resize/Scale. Documents projected on a flexible display can be scaled using one of two techniques. Firstly, the content of a display can be zoomed within the document. Secondly, users can transfer the source material to a flexible display with a larger size. This is achieved by rubbing the source display onto a larger display. Upon transfer, the content automatically resizes to fit the larger format.
  • Share. Collocated users often share information by emailing or printing out documents. We implemented two ways of sharing: slave and copy. When slaving a document, a user issues a stapling gesture to clone the source onto a blank display. In the second technique, the source is copied to a blank display using the rubbing gesture, then handed to the group member.
  • Open. Users can use flexible displays, or other objects, including computer peripherals such as scanners and copiers, as digital stationery. Stationery pages are blank flexible displays that only display a set of application icons. Users can open a new document on the flexible display by tapping an application icon. Users may retrieve content from a scanner or email appliance by rubbing it onto said scanner or appliance. Users may also put the display or associated computing resources in a state of reduced energy use through a roll or semi-permanent fold gesture, where said condition is reversed upon unrolling or unfolding said display.
  • Save. A document is saved by performing the rubbing gesture on a single flexible display, typically while it is placed on a surface.
  • Close. Content displayed on a flexible display may be closed by transferring its contents to a desktop computer using a rubbing gesture. Content may be erased by crumpling or shaking the flexible display.
  • Apparatus of the Invention
  • In one embodiment of the invention, a real piece of flexible, curved or three-dimensional material, such as a cardboard model, piece of paper, textile or human skin may be tracked using computer vision, modeled, texture mapped and then projected back upon the object. Alternatively, the computer vision methods may simply be used to track the shape, orientation and location of a flexible display that does not require the projection component. This in effect implements a projected two-sided flexible display surface that follows the movement, shape and curves of any object in six degrees of freedom. An overview of the elements required for such embodiment of the flexible display (1) is provided in FIGS. 11 and 12. In this non-limiting example, the surface is augmented with infrared (IR) reflective marker dots (3). FIG. 13 shows the elements of the capture and projection system, where the fingers (6) of the user (7) are tracked by affixing three or more IR marker dots to the digit. A digital projection unit (5) allows for projection of the image onto the scene, and a set of infrared or motion capturing cameras (4) allows tracking of the shape, orientation and location of the sheets of paper. The following section discusses each of the above apparatus elements, illustrating their relationship to other objects in this embodiment of the system. This example does not exclude other possible embodiments of the apparatus, which include accelerometers embedded in lieu of the marker dots, and mounted on flexible displays. In such an embodiment, the wireless accelerometers report acceleration of the marked positions of the material in three dimensions to a host computer so as to determine their absolute or relative location.
  • In one embodiment, the computer vision component uses a Vicon (23) tracker or equivalent computer vision system that can capture three dimensional motion data of retro-reflective markers mounted on the material. Our setup consists of 12 cameras (4) that surround the user's work environment, capturing three dimensional movement of all retro-reflective markers (3) within a workspace of 20′×10′ (see FIG. 13). The system then uses the Vicon data to reconstruct a complete three-dimensional representation that maps the shape, location and orientation of each flexible display surface in the scene.
  • In this embodiment, an initial process of modeling the flexible display is required before obtaining the marker data. First, a Range of Motion (ROM) trial is captured that describes typical movements of the flexible display through the environment. This data is used to reconstruct a three dimensional model that represents the flexible display. Vicon software calibrates the ROM trial to the model and uses it to understand the movements of the flexible display material during a real-time capture, effectively mapping each marker dot on the surface to a corresponding location on the model of the flexible display in memory. To obtain marker data, we modified sample code that is available as part of Vicon's Real Time Development Kit (23).
  • As said, each flexible display surface within the workspace is augmented with IR reflective markers, accelerometers and/or optical fibres that allow the shape, deformation, orientation and location of said surface to be computed. In the embodiment of a paper sheet, or paper-shaped flexible display surface, the markers are affixed to form an eight point grid (see FIGS. 11 and 12). In the embodiment where computer vision is used, a graphics engine interfaces with the Vicon server, which streams marker data to our modeling component. In the embodiment where accelerometers are used, coordinates or relative coordinates of the markers are computed from the acceleration of said markers, and provided to our modeling component. The modeling component subsequently constructs a three-dimensional model in OpenGL of each flexible display surface that is tracked by the system. The center point of the flexible display surface is determined by averaging the locations of the markers on said surface. Bezier curve analysis of marker locations is used to construct a fluid model of the flexible display surface shape, where Bezier control points correspond with the location of markers on the display surface. Subsequent analysis of the movement of said surface is used to detect the various gestures.
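  • The modeling step described above, averaging marker locations to obtain the display centre and fitting Bezier curves whose control points coincide with the tracked markers, could be sketched as follows. This is an illustrative Python/NumPy approximation only (the actual system builds an OpenGL model from streamed tracker data); the grid layout, function names and sampling density are assumptions.

```python
import numpy as np

def bezier(control_points: np.ndarray, samples: int = 32) -> np.ndarray:
    """Evaluate a Bezier curve whose control points are 3D marker locations,
    using de Casteljau's algorithm; returns an array of sampled 3D points."""
    ts = np.linspace(0.0, 1.0, samples)
    out = []
    for t in ts:
        pts = control_points.astype(float)
        while len(pts) > 1:
            pts = (1.0 - t) * pts[:-1] + t * pts[1:]
        out.append(pts[0])
    return np.array(out)

def model_surface(marker_grid: np.ndarray, samples: int = 32) -> dict:
    """Build a simple surface model from a grid of tracked markers
    (rows x cols x 3): one Bezier curve per marker row, plus the centre
    point obtained by averaging all marker locations."""
    curves = np.array([bezier(row, samples) for row in marker_grid])
    centre = marker_grid.reshape(-1, 3).mean(axis=0)
    return {"curves": curves, "centre": centre}

if __name__ == "__main__":
    # Hypothetical 2 x 4 marker grid on a gently bent sheet (units: metres).
    grid = np.array([[[0.0, 0.0, 0.00], [0.1, 0.0, 0.02],
                      [0.2, 0.0, 0.02], [0.3, 0.0, 0.00]],
                     [[0.0, 0.2, 0.00], [0.1, 0.2, 0.02],
                      [0.2, 0.2, 0.02], [0.3, 0.2, 0.00]]])
    model = model_surface(grid)
    print(model["centre"], model["curves"].shape)
```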
  • Applications that provide content to the flexible displays run on an associated computer. In cases where the flexible display surface consists of a polymer flexible display capable of displaying data without projection, application windows are simply transferred and displayed on said display. In the case of a projected flexible display, application windows are first rendered off-screen into the OpenGL graphics engine. The graphics engine performs real-time screen captures, and maps a computer image to the three dimensional OpenGL model of the display surface. The digital projector then projects an inverse camera view back onto the flexible display surface. Back projecting the transformed OpenGL model automatically corrects for any skew caused by the shape of the flexible display surface, effectively synchronizing the two. The graphics engine similarly models fingers and pens in the environment, posting this information to the off-screen window for processing as cursor movements. Alternatively, input from pens, fingers or other input devices can be obtained through other methods known in the art. In this non-limiting example, fingers (6) of the user (7) are tracked by augmenting them with 3 IR reflective markers (3). Markers are placed evenly from the tip of the finger up to the base knuckle. Pens are tracked similarly throughout the environment. The intersection of a finger or pen with a flexible display surface is calculated using planar geometry. When the pen or finger is sufficiently close, its tip is projected onto the plane of the flexible display surface. The position of the tip is then related to the length and width of the display. The x and y position of the point on the display (1) is calculated using simple trigonometry. When the pen or finger touches the display, the input device is engaged.
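  • The planar-geometry hit test described above, projecting a tracked fingertip onto the plane of the display patch and relating it to the display's width and height, might look roughly like the following Python sketch. The corner/edge-vector parameterization of the display and the touch threshold value are assumptions introduced for illustration, not values from the specification.

```python
import numpy as np

def locate_tip_on_display(tip, origin, u_edge, v_edge, touch_threshold=0.01):
    """Project a tracked fingertip onto the (locally planar) display patch and
    express it in display coordinates.

    origin -- 3D position of one display corner
    u_edge -- 3D vector along the display width
    v_edge -- 3D vector along the display height
    Returns (x, y, touching): x and y in [0, 1] along width/height, and
    touching=True when the tip is within touch_threshold of the surface plane
    and over the patch.
    """
    tip, origin = np.asarray(tip, float), np.asarray(origin, float)
    u, v = np.asarray(u_edge, float), np.asarray(v_edge, float)
    normal = np.cross(u, v)
    normal /= np.linalg.norm(normal)
    rel = tip - origin
    distance = abs(rel @ normal)                 # perpendicular distance to plane
    projected = rel - (rel @ normal) * normal    # drop the tip onto the plane
    x = (projected @ u) / (u @ u)                # normalized width coordinate
    y = (projected @ v) / (v @ v)                # normalized height coordinate
    touching = distance <= touch_threshold and 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0
    return x, y, touching

if __name__ == "__main__":
    # Hypothetical 0.3 m x 0.2 m display lying flat, fingertip 5 mm above it.
    print(locate_tip_on_display(tip=(0.15, 0.10, 0.005),
                                origin=(0, 0, 0),
                                u_edge=(0.3, 0, 0),
                                v_edge=(0, 0.2, 0)))
```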
  • Imaging
  • In the embodiment of a projected flexible display, computer images or windows are rendered onto the paper by a digital projector (5) positioned above the workspace. The projector is placed such that it allows a clear line of sight with the flexible display surface between zero and forty-five degrees of visual angle. Using one projector introduces a set of tradeoffs. For example, positioning the projector close to the scene improves the image quality but reduces the overall usable space, and vice versa. Alternatively a set of multiple projectors can be used to render onto the flexible display surface as it travels throughout the environment of the user.
  • Initially, a calibration procedure is required to pair the physical position of the flexible display surface and the digital output of the projector. This is accomplished by adjusting the position, rotation, and size of the projector output until it matches the dimensions of the physical display surface.
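  • Although the calibration above is described as a manual adjustment of position, rotation and size, one hypothetical way such a pairing could be computed from matched corner points is a least-squares similarity fit. The sketch below is illustrative only; the corner and pixel values are invented, and this automated fit is a stand-in technique rather than the procedure used in the apparatus.

```python
import numpy as np

def fit_similarity(display_pts, projector_pts):
    """Estimate a 2D similarity transform (scale, rotation, translation) that
    maps points measured on the physical display surface to the corresponding
    pixels of the projector output (least-squares, Umeyama-style)."""
    X = np.asarray(display_pts, float)    # physical corner positions (metres)
    Y = np.asarray(projector_pts, float)  # matching projector pixels
    mx, my = X.mean(axis=0), Y.mean(axis=0)
    Xc, Yc = X - mx, Y - my
    U, S, Vt = np.linalg.svd(Xc.T @ Yc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    D = np.array([1.0, d])
    R = Vt.T @ np.diag(D) @ U.T
    scale = (S * D).sum() / (Xc ** 2).sum()
    t = my - scale * R @ mx
    return scale, R, t

if __name__ == "__main__":
    # Hypothetical pairing of the four display corners with projector pixels.
    corners = [(0.0, 0.0), (0.3, 0.0), (0.3, 0.2), (0.0, 0.2)]   # metres
    pixels = [(110, 90), (710, 95), (705, 495), (105, 490)]       # pixels
    s, R, t = fit_similarity(corners, pixels)
    print(round(s), t.round(1))
```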
  • Gesture Analysis
  • In the following section, the term “marker” is interchangeable with the term “accelerometer”. Understanding the physical motion of paper and other materials in the system requires a combination of approaches. For gestures such as stapling, it is relatively easy to recognize when two flexible displays are rapidly moved towards each other. However, flipping requires knowledge of a flexible display surface's prior state. To recognize this event, the z location of markers at the top and bottom of the page is tracked. During a vertical or horizontal half-rotation, the relative location on the z dimension is exchanged between markers. The movement of the markers is compared to their previous position to determine the direction of the flip, fold or bend.
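  • A minimal sketch of the flip detection described above, checking whether the markers at the top and bottom edges exchanged their relative z order between two sampled frames of a half-rotation, might read as follows. The marker naming, frame format and direction labels are assumptions made for illustration; the actual recognizer also distinguishes folds and bends.

```python
def detect_flip(prev_markers, curr_markers):
    """Detect a flip around the horizontal axis by checking whether the top
    and bottom edge markers exchanged their relative height (z) between two
    sampled frames.

    Each argument maps marker names to (x, y, z) positions.
    Returns 'flip_forward', 'flip_backward' or None.
    """
    prev_diff = prev_markers["top"][2] - prev_markers["bottom"][2]
    curr_diff = curr_markers["top"][2] - curr_markers["bottom"][2]
    if prev_diff == 0 or curr_diff == 0 or (prev_diff > 0) == (curr_diff > 0):
        return None  # no exchange of relative z order, hence no flip detected
    # The sign of the earlier difference indicates which edge came over the top.
    return "flip_forward" if prev_diff > 0 else "flip_backward"

if __name__ == "__main__":
    before = {"top": (0.0, 0.2, 0.05), "bottom": (0.0, 0.0, 0.00)}
    after = {"top": (0.0, 0.0, 0.00), "bottom": (0.0, 0.2, 0.05)}
    print(detect_flip(before, after))  # flip_forward
```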
  • To detect more advanced gestures, like rubbing, marker data is recorded over multiple trials and then isolated in the data. Once located, the gesture is normalized and is used to calculate a distance vector for each component of the fingertip's movement. The system uses this distance vector to establish a confidence value. If this value passes a predetermined threshold the system recognizes the gesture, and if such gesture occurs near the display surface, a rubbing event is issued to the application.
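  • The confidence-based recognition of the rubbing gesture could be sketched along the following lines: the recorded fingertip trajectory is resampled and normalized, compared component-wise against a stored template to obtain a distance vector, and accepted when the resulting confidence passes a threshold. The resampling scheme, confidence formula and threshold value below are assumptions for illustration, not the recognizer trained from the recorded trials.

```python
import numpy as np

def rub_confidence(trajectory, template, samples=32):
    """Resample and normalize both fingertip paths, compute a per-component
    distance vector between them, and return a confidence value that grows
    as the observed trajectory approaches the template."""
    def normalise(path):
        path = np.asarray(path, float)
        # Resample to a fixed number of points along the path index.
        idx = np.linspace(0, len(path) - 1, samples)
        path = np.array([np.interp(idx, np.arange(len(path)), path[:, d])
                         for d in range(path.shape[1])]).T
        path -= path.mean(axis=0)              # remove position
        scale = np.abs(path).max() or 1.0
        return path / scale                    # remove amplitude

    distance = np.abs(normalise(trajectory) - normalise(template)).mean()
    return 1.0 / (1.0 + distance)

def is_rub(trajectory, template, threshold=0.8):
    """A rubbing event is issued when the confidence passes a set threshold."""
    return rub_confidence(trajectory, template) >= threshold

if __name__ == "__main__":
    t = np.linspace(0, 4 * np.pi, 60)
    template = np.stack([np.sin(t), 0.05 * np.ones_like(t)], axis=1)  # back-and-forth
    observed = np.stack([0.9 * np.sin(t + 0.1), 0.04 * np.ones_like(t)], axis=1)
    print(is_rub(observed, template))
```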
  • EXAMPLES Example 1 Photo Collage
  • There are many usage scenarios that would benefit from the functionality provided by the invention. One such non-limiting example is the selection of photos for printout from a digital photo database containing raw footage. Our design was inspired by the use of contact sheets by professional photographers. Users can compose a photo collage using two flexible displays, selecting a photo on one overview display and then rubbing it onto the second display with a rubbing gesture. This scenario shows the use of flexible display input as a focus and context technique, with one display providing a thumbnail overview of the database, and the other display offering a more detailed view.
  • Users can select thumbnails by pointing at the source page, or by selecting rows through producing a foldline with a bend gesture. By crossing two fold lines, a single photo or object may be selected. Thumbnails that appear rotated can be turned using a simple pivoting action of the index finger. After selection, thumbnails are transferred to the destination page through a rubbing gesture. After the copy, thumbnails may resize to fit the destination page. When done, the content of the destination flexible display can be printed by performing a rubbing gesture onto a printer. The printer location is tracked similarly to that of the flexible display, and is known to the system. Gestures supported by the invention can also be used to edit photos prior to selection. For example, photos are cropped by selecting part of the image with a two-handed gesture, and then rubbing the selection onto a destination flexible display. Photos can be enlarged by rubbing them onto a larger flexible display.
  • Example 2 Flexible Cardboard Game
  • In this non-limiting embodiment, the invention is used to implement a computer game that displays its graphic animations onto physical game board pieces. Said pieces may consist of cardboard that is tracked and projected upon using the apparatus described in this invention, or electronic paper, LCD, e-ink, OLED or other forms of thin, or thin-film displays. The well-known board game Settlers of Catan consists of a game board design in which hexagonal pieces with printed functionality can be placed differently in each game, allowing for a board layout that differs from game to game. Each hexagonal piece, or hex, represents a raw material or good that can be used to build roads or settlements, which is the purpose of the game. In this application, each hex is replaced by a flexible display of the same shape, the position and orientation of which are tracked as the hexes are laid out to form the board. A computer algorithm then renders the functionality onto each flexible display hex, calculating and randomizing the board design each time, but within and according to the rules of the game. The graphics on the hexes are animated with computer graphics that track and represent the state of the game. All physical objects in the game are tracked by the apparatus of our invention and can potentially be used as display surfaces. For example, when a user rolls a die, the outcome of said roll is known to the game. Alternatively, the system may roll the die for the user, representing the outcome on a cube-shaped flexible display that represents the cast die. In the game, the number provided by said die indicates the hex that is to produce goods for the users. As an example of an animation presented on a hex during this state of the game, when the hex indicates woodland, a lumberjack may be animated to walk onto the hex to cut a tree, thus providing the wood resource to a user. Similarly, city and road objects may be animated with wagons and humans after they are placed onto the hex board elements. Hex elements that represent ports or seas may be animated with ships that move goods from port to port. Animations may trigger behavior in the game, making the game more challenging. For example, a city or port may explode, requiring the user to take action, such as rebuilding the city or port. Alternatively, a resource may be depleted, which is represented by a woodland hex slowly turning into a meadow hex, and a meadow hex slowly turning into a desert hex that is unproductive. Climate may be simulated, allowing users to play the game under different seasonal circumstances, thus affecting their constraints. For example, during winters, ports may not be in use. This invention allows the functionality of PC-based or online computer games known in the art, such as Simcity, The Sims, World of Warcraft, or Everquest to be merged with that of physical board game elements.
  • Example 3 3D Flexible Display Objects
  • In this non-limiting embodiment, the invention is used to provide display on any three dimensional object, such that it allows animation or graphics rendering on said three dimensional object. For example, the invention may be used to implement a rapid prototyping environment for the design of electronic appliance user interfaces, such as, for example, but not limited to, the Apple iPod. One element of such embodiment is a three dimensional model of the appliance, made out of cardboard, Styrofoam, or the like, and either tracked and projected upon using the apparatus of this invention or coated with electronic paper, LCD, e-ink, OLED or other forms of thin, or thin-film displays, such that the shapes and curvatures of the appliance are followed. Another flexible display surface, tracked by the apparatus described in this invention, acts as a palette on which user interface elements such as displays and dials are displayed. These user interface elements can be selected and picked up by the user by tapping their corresponding locations on the palette display. Subsequent tapping on the appliance model places the selected user interface element onto the appliance's flexible display surface. User interface elements may be connected or associated with each other using a pen or finger gesture on the surface of the model. For example, a dial user interface element may be connected to a movie user interface element on the model, such that said dial, when activated, causes a scroll through said movie. After organizing elements on the surface, subsequent tapping of the user onto the model may actuate functionality of the appliance, for example, a play button may cause the device to produce sound or play a video on its movie user interface element. This allows designers to easily experiment with various interaction styles and layout of interaction elements such as buttons and menus on the appliance design prior to manufacturing. In another embodiment, the above model is a three-dimensional architectural model that represents some building design. Here, each element of the architectural model consists of a flexible display surface. For example, one flexible display surface may be shaped as a wall element, while another flexible display surface may be shaped as a roof element, the two being physically placed together to form the larger architectural model. Another flexible display surface acts as a palette on which the user can select colors and materials. These can be pasted onto the flexible display elements of the architectural model using any of the discussed interaction techniques. Once pasted, said elements of the architectural model reflect and simulate materials or colors to be used in construction of the real building. As per Example 2, the flexible display architectural model can be animated such that living or physical conditions such as seasons or wear and tear can be simulated. In another embodiment, the flexible display model represents a product packaging. Here, the palette contains various graphical elements that can be placed on the product packaging, for example, to determine the positioning of typographical elements on the product.
By extension of this example, product packaging may itself contain or consist of one or multiple flexible display surfaces, such that the product packaging can be animated or used to reflect some computer functionality, including but not limited to online content, messages, RSS feeds, animations, TV shows, newscasts, games and the like. As a non-limiting example, users may tap the surface of a soft drink or food container with an embedded flexible display surface to play a commercial advertisement or TV show on said container, or to check electronic messages. Users may rotate the container to scroll through content on its display, or use a rub gesture to scroll through content. In another embodiment, the product packaging is itself used as a pointing device, that allows users to control a remote computer system.
  • Example 4 Flexible Textile Display
  • In this non-limiting example, the flexible display surface consists of electronic textile displays such as but not limited to OLED textile displays known in the art, or white textiles that are tracked and projected upon using the apparatus of this invention. These textile displays may be worn by a human, and may contain interactive elements such as buttons, as per Example 3. In one embodiment of said flexible display fabric, the textile is worn by a human and the display is used by a fashion designer to rapidly prototype the look of various textures, colors or patterns of fabric on the design, in order to select said print for a dress or garment made out of real fabric. In another embodiment, said flexible textile displays bearing said textures are permanently worn by the user and constitute the garment. Here, said flexible display garment may display messages that are sent to said garment through electronic means by other users, or that represent advertisements and the like.
  • In another embodiment, the flexible textile display is worn by a patient in a hospital, and displays charts and images showing vital statistics, including but not limited to x-ray, ct-scan, or MRI images of said patient. Doctors may interact with user interface elements displayed on said flexible textile display through any of the interaction techniques of this invention and any technique known in prior art. This includes tapping on buttons or menus displayed on said display to select different vital statistics of said patient. In an operating theatre, the flexible textile display is draped on a patient in surgery to show models or images including but not limited to x-ray, ct-scan, MRI or video images of elements inside the patient's body to aid surgeons in, for example, pinhole surgery and minimally invasive operations. Images of various regions in the patient's body may be selected by moving the display to that region.
  • Example 5 Flexible Human Display
  • Alternatively, images of vital statistics, x-rays, ct-scans, MRIs, video images and the likes may be projected directly onto a patient to aid or otherwise guide surgery. Here, the human skin itself functions as a display through projection onto said skin, and through tracking the movement and shape of said skin by the apparatus of invention. Such images may contain user interface elements that can be interacted with by a user through techniques of this invention, and those known in the art. For example, tapping a body element may bring up a picture of the most recent x-ray of that element for display, or may be used as a form of input to a computer system.
  • Example 6 Origami Flexible Display
  • In this embodiment, several pieces of flexible display are affixed to one another through a cloth, polymer, metal, plastic or other form of flexible hinge such that the shape of the overall display can be folded in a variety of three dimensional shapes, such as those found in origami paper folding. Folding action may lead to changes on the display or trigger computer functionality. Geometric shapes of the overall display may trigger behaviors or computer functionality.
  • Example 7 Flexible Input Device
  • In this embodiment, the flexible surface with markers is used as input to a computer system that displays on a standard display that is not said flexible surface, allowing use of said flexible surface and the gestures in this invention as an input device to a computing system.
  • The contents of all cited patents, patent applications, and publications are incorporated herein by reference in their entirety. While the invention has been described with respect to illustrative embodiments thereof, it will be understood that various changes may be made in the embodiments without departing from the scope of the invention. Accordingly, the described embodiments are to be considered merely exemplary and the invention is not to be limited thereby.
  • References
    • 1. Balakrishnan, R., Fitzmaurice, G., Kurtenbach, G. and Singh, K. Exploring Interactive Curve and Surface Manipulation Using a Bend and Twist Sensitive Input Strip. In Proceedings of the 1999 Symposium on Interactive 3D Graphics, ACM Press, 1999, pp. 111-118.
    • 2. Fishkin, K., Gujar, A., Harrison, B., Moran, T. and Want, R. Embodied User Interfaces for Really Direct Manipulation. In Communications of the ACM, v.43 n.9, 2000, pp. 74-80.
    • 3. Guimbretiere, F. Paper Augmented Digital Documents. In Proceedings of UIST 2003. Vancouver: ACM Press, 2003, pp. 51-60.
    • 4. Holman, D., Vertegaal, R., Troje, N. PaperWindows: Interaction Techniques for Digital Paper. In Proceedings of ACM CHI 2005 Conference on Human Factors in Computing Systems. Portland, Oreg.: ACM Press, 2005.
    • 5. Ishii, H. and Ullmer, B. Tangible Bits: Towards Seamless Interfaces Between People, Bits and Atoms. In Proceedings of CHI 1997. Atlanta: ACM, 1997, pp. 234-241.
    • 6. Johnson, W., Jellinek, H., Klotz, L., Rao, R. and Card S. Bridging the Paper and Electronic Worlds: The Paper User Interface. In Proceedings of the INTERCHI 1993. Amsterdam: ACM Press, 1993, pp. 507-512.
    • 7. Ju, W. Bonanni, L., Fletcher, R., et al. Origami Desk: Integrating Technological Innovation and Human-centric Design. In Proceedings of DIS 2002. London: ACM Press, 2002, pp. 399-405.
    • 8. Klemmer, S., Newman, M., Farrell, R., Bilezikjian, M. and Landay, J. The Designers' Outpost: A Tangible Interface for Collaborative Web Site Design. In Proc. of UIST 2001. Orlando: ACM Press, 2001, pp. 1-10.
    • 9. Lange, B., Jones, M., and Meyers, J. Insight Lab: An Immersive Team Environment Linking Paper Displays and Data. In Proceedings of CHI 1998. Los Angeles: ACM Press, 1998, pp. 550-557.
    • 10. Levine, S. R. and S. F. Ehrlich. The Freestyle System: A Design Perspective. In Human-Machine Interactive Systems, A. Klinger, Editor, 1991, pp. 3-21.
    • 11. Mackay, W. E. and Fayard, A.-L. Designing Interactive Paper: Lessons from Three Augmented Reality Projects. In Proceedings of IWAR '98, International Workshop on Augmented Reality. Natick, MA: A K Peters, Ltd., 1998.
    • 12. Moran, T., Saund, E., Van Melle, W., Gujar, A., Fishkin, K. and Harrison, B. Design and Technology for Collaborage: Collaborative Collages of Information on Physical Walls. In Proceedings of UIST 1999. Asheville, N.C.: ACM Press, 1999, pp. 197-206.
    • 13. O'Hara, K. and Sellen, A. A Comparison of Reading Paper and On-line Documents. In Proceedings of CHI 1997. Atlanta: ACM Press, 1997, pp. 335-342.
    • 14. Philips OLED Technology. http://www.business-sites.philips.com/mds/section-1131/
    • 15. Piper, B., Ratti, C. and Ishii, H. Illuminating Clay: A 3-D Tangible Interface for Landscape Analysis. In Proceedings of CHI 2002. Minneapolis: ACM Press, 2002.
    • 16. Rekimoto, J. Pick-and-Drop: A Direct Manipulation Technique for Multiple Computer Environments. In Proceedings of UIST 1997. Banff: ACM Press, 1997, pp. 31-39.
    • 17. Rekimoto, J. Ullmer, B. and H. Oba, DataTiles: A Modular Platform for Mixed Physical and Graphical Interactions. In Proceedings of CHI 2001. Seattle: ACM Press, 2001.
    • 18. Rekimoto, J. SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces. In Proceedings of CHI 2002. Minneapolis: ACM Press, 2002, pp. 113-120.
    • 19. Schilit, B., Golovchinsky, G., and Price, M. Beyond Paper: Supporting Active Reading with Free Form Digital Ink Annotations. In Proceedings of CHI 1998. Los Angeles: ACM Press, 1998, pp. 249-256.
    • 20. Schwesig, C., Poupyrev, I., and Mori, E. Gummi: A Bendable Computer. In Proceedings of CHI 2004. Vienna: ACM Press, 2004, pp. 263-270.
    • 21. Sellen, A., and Harper, R. The Myth of the Paperless Office, MIT Press, Cambridge, Mass., 2003.
    • 22. Sun Starfire: A Video of Future Computing. http://www.asktog.com/starfire/starfirescript.html.
    • 23. Vicon. http://www.vicon.com
    • 24. Weiser, M. The Computer for the 21st Century. Scientific American, 1991, 265 (3), pp. 94-104.
    • 25. Wellner, P. The DigitalDesk Calculator: Tangible Manipulation on a Desk Top Display. In Proceedings of UIST 1991. Hilton Head: ACM Press, 1991, pp. 27-33.

Claims (21)

1. A method for capturing location, orientation and shape of one or more flexible display surface(s) comprising the steps of:
a) Determining the location in three dimensions of one or more Points within said flexible display surface(s);
b) Calculating a three dimensional model of the shape, orientation and location of said flexible display surface(s);
c) Clustering locations of Points and fitting curves through said measured locations of Points to determine the three dimensional model; and
d) Optionally determining the relative locations of Points such that the state of the shape or deformation of said flexible display surface(s) can be recognized.
2. The method of claim 1 wherein the flexible display surface is a three dimensional surface made of a material, selected from a group consisting of: paper, cardboard, paper-like materials, electronic paper, thin substrate displays, thin-film substrate displays, flexible substrate displays, liquid crystal devices, liquid crystal diodes, light emitting devices, light emitting diodes, organic light emitting devices, stacked organic light emitting devices, transparent organic light emitting devices, polymer light emitting devices, organic light emitting diodes, stacked organic light emitting diodes, transparent organic light emitting diodes, polymer light emitting diodes, optical fibres, styrofoam, plastics, epoxy resin, textiles, e-textiles, clothing, skin of a living or dead human or other organism, body of a living or dead human or other organism, carbon-based materials and any three-dimensional object or model.
3. The method of claim 1 wherein a Point is a light reflective marker, embedded or otherwise affixed to said flexible display surface, and where the device for capturing three-dimensional location is an active or passive computer vision system that comprises one or more cameras.
4. The method of claim 1 wherein a Point is an accelerometer embedded or otherwise affixed to said flexible display surface, and where acceleration of said accelerometer is used to calculate the three dimensional position or velocity of said Point.
5. The method of claim 1 wherein Points are inferred from properties of said flexible display surface, as extracted from a background by a computer vision algorithm using properties of said flexible display surface that include shape, color, image or brightness.
6. The method of claim 1 wherein deformation of said flexible display surface is determined by measuring the intensity of light passing through one or more optical fiber mounted along said flexible display surface.
7. A method for capturing the location in three dimensions of the finger(s) of one or multiple hands or some tool held by one or multiple hand(s) for the purpose of determining location of said finger(s) or said tool or said hand(s) within a flexible display surface, comprising the steps of:
a) Measuring the location in three dimensions of one or more Point(s) located on said fingers or stylus; and
b) Relating said location of said Points to a coordinate system defined by said flexible display surface so as to obtain a position relative to said coordinate system.
8. The method of claim 7 wherein a Point is a light reflective marker, embedded or otherwise affixed to said flexible display surface, and where the device for capturing three-dimensional location is an active or passive computer vision system that comprises one or more cameras.
9. The method of claim 8 where the marker is selected from a group consisting of: infrared reflective semisphere or sphere, infrared reflective pattern or object, sphere, semisphere or pattern reflecting specific color(s) in the visible light spectrum, and infrared reflective ink pattern.
10. The method of claim 7 wherein a Point is an accelerometer embedded or otherwise affixed to said flexible display surface, and where acceleration of said accelerometer is used to calculate the three dimensional position or velocity of said Point.
11. The method of claim 7 wherein the location of fingers or tools is sensed through other means known in the art, including but not limited to touch screens, capacitive sensors, electromagnetic field tracking or other forms of computer vision.
12. A method for projecting Image(s) onto a surface, corrected for the shape, orientation and location of said surface through a model obtained by the methods of claims 1 and 2, using a series of projector(s) mounted such that they project upon the flexible display surface(s) and cover the space through which said flexible display surface(s) may move.
13. The method of claim 12 wherein the Image is a three dimensional model consisting of the shape and/or location and/or orientation of said surface, and wherein said three dimensional model is texture-mapped with a second Image selected from a group consisting of: the contents of a computer window, the contents of a computer file or document, any other static electronic image(s), and any moving electronic images.
14. The method of claim 12 wherein the surface is a three dimensional surface made of a material selected from a group consisting of: paper, cardboard, paper-like materials, electronic paper, thin substrate displays, thin-film substrate displays, flexible substrate displays, liquid crystal devices, liquid crystal diodes, light emitting devices, light emitting diodes, organic light emitting devices, stacked organic light emitting devices, transparent organic light emitting devices, polymer light emitting devices, organic light emitting diodes, stacked organic light emitting diodes, transparent organic light emitting diodes, polymer light emitting diodes, optical fibres, styrofoam, plastics, epoxy resin, textiles, e-textiles, clothing, skin of a living or dead human or other organism, body of a living or dead human or other organism, carbon-based materials and any three-dimensional object or model.
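For claims 12 and 13, the sketch below assumes a calibrated pinhole projector (intrinsics K, rotation R, translation t) and shows how vertices of the reconstructed surface model would be mapped to projector pixels; rasterizing the texture-mapped mesh at those pixel positions yields the shape-corrected Image. This is an editorial illustration, not the patented implementation.

```python
# Editorial sketch for claims 12-13: map vertices of the reconstructed surface
# model to projector pixels with a calibrated pinhole model. K, R and t are
# assumed calibration inputs; nothing here is taken from the patent itself.
import numpy as np

def project_vertices(model_vertices, K, R, t):
    """model_vertices: (N, 3) world-space points of the texture-mapped mesh.
    Returns (N, 2) pixel coordinates in the projector image."""
    cam = R @ model_vertices.T + t.reshape(3, 1)  # world -> projector frame
    uvw = K @ cam                                 # perspective projection
    return (uvw[:2] / uvw[2]).T

# Each mesh vertex also carries texture coordinates into the content Image of
# claim 13; rasterizing the mesh with the pixel positions above produces a
# projector frame that appears undistorted on the deformed surface.
```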
15. A method for providing input to a computer system that uses properties of shape, orientation and/or location of one or more flexible surface(s) associated with said computer system, or deformation of said properties, wherein said properties are selected from a group consisting of:
a) Hold, wherein a single flexible surface is activated as a destination of computer commands, or activates associated computing commands, by holding it with one or two hands, and where said surface remains the active surface until another such surface is activated.
b) Collocate, wherein collocating multiple flexible surfaces is used to create a larger flexible surface, which act serves as input to a computer system.
c) Collate, wherein multiple flexible surfaces are organized by stacking them on top of one another, and where such organization is used as input to a computer system.
d) Flip or Turn, wherein rotating a flexible surface around its horizontal or vertical axes such that one of the extremities of the surface is lifted up, then folded over, is used as input to a computer system.
e) Fold, wherein folding a flexible surface around any of its axes serves as a means of input to a computer system.
f) Part-fold, wherein partly folding a flexible surface serves as input to a computer system.
g) Semi-permanent fold, wherein the act or shape resulting from folding a flexible surface around any of its axes in such way that it remains in a folded state after release, serves as input to a computing system.
h) Roll, wherein the act of changing the shape of a flexible surface such that said shape transitions from planar to cylindrical or vice versa serves as input to a computing system.
i) Bend, wherein bending a flexible surface around any of its axes serves as input to a computing system.
j) Rub, wherein providing a rubbing gesture in which the hand or finger or some tool is moved back and forth over a flexible surface is used as input to a computing system.
k) Staple, wherein the act of impacting a first flexible surface with a second flexible surface serves as input to a computing system; and
l) Pointing, wherein the location of such hand(s), tool or finger(s) serve as input to a computing system.
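The following sketch illustrates, under stated assumptions, how the gesture vocabulary of claim 15 might be routed to a computing system once a surface's state has been recognized; the gesture names mirror the claim, while the SurfaceState fields and handler keys are editorial inventions.

```python
# Editorial sketch only: routing recognized surface states to handlers for the
# gestures named in claim 15. The SurfaceState fields and handler keys are
# assumptions; detection itself would come from the shape model of claim 1.
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class SurfaceState:
    deformation: str           # e.g. "flat", "bent", "folded", "rolled"
    held: bool                 # surface currently held with one or two hands (Hold)
    stacked_on: Optional[str]  # id of a surface it is collated on top of, if any
    rub_detected: bool         # back-and-forth hand or finger motion (Rub)

def dispatch(state: SurfaceState, handlers: Dict[str, Callable[[], None]]) -> None:
    """Fire the handler for the most specific gesture present in the state."""
    if state.rub_detected and state.stacked_on is not None:
        handlers["copy_to_lower_surface"]()   # the Rub-on-collated case of claim 16
    elif state.deformation == "folded":
        handlers["fold"]()
    elif state.deformation == "rolled":
        handlers["roll"]()
    elif state.deformation == "bent":
        handlers["bend"]()
    elif state.held:
        handlers["hold"]()
```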
16. The method of claim 15 wherein property j is applied to a first flexible surface collated on top of a second flexible surface, and wherein said property causes the content of said first flexible surface to be copied or otherwise moved onto said second flexible display surface.
17. The method of claim 16 wherein said second flexible surface is a traditional or rigid computer display terminal, and wherein said content is moved from said second flexible surface to said first flexible surface if said first flexible surface does not display an image, or vice versa when said first flexible surface does display an image.
18. The method of claim 16 wherein said second flexible surface is any computing peripheral that has a processing action and corresponding software associated, and wherein content is moved from said first flexible surface to said computing peripheral for processing.
19. The method of claim 18 wherein said computing peripheral is a printer or network peripheral, and wherein said content is moved to said printer or network peripheral for printing, or to a remote location for printing or viewing on a computing system.
20. The method of claim 15 wherein said input to a computing system causes a command to execute on said computing system and wherein said command is selected from a group consisting of:
a) Activate, wherein a file or computer content, image, selection, or window associated with or displaying on said flexible surface is selected for other commands, such as but not limited to editing commands.
b) Zoom in or Enlarge, wherein an image or content of a file associated with said flexible surface is enlarged or zoomed in on.
c) Zoom out or Reduce, wherein an image associated with said flexible surface is reduced or zoomed out of.
d) Organize, wherein some property of file(s), digital information, text, images, or other computer content associated with or displaying on said flexible surface(s) is organized or sorted digitally in a way that matches properties of the physical organization of said flexible surface(s), such as, but not limited to, their physical order.
e) Scroll, wherein an image or content of a file associated with said flexible surface is scrolled, such that a portion of said image, or content of said file is revealed that is currently not rendered, or that is contiguous to what is currently rendered on said flexible surface, or some other display.
f) Page Down, wherein a section of the content of a file that is subsequent to the section of said content that is currently displayed on or associated with said flexible surface, or some other display, is navigated to such that it causes said subsequent section to render on said flexible surface or display.
g) Page Up, wherein a section of the content of a file that precedes the section of said content that is currently displayed on or associated with said flexible surface, or some other display, is navigated to such that it causes said preceding section to render on said flexible surface or display.
h) Navigate, wherein an arbitrary section of the content of a file associated with the flexible surface, or some online content or hyperlink associated with said surface is navigated to such that it causes said content to render on said flexible surface or some other display.
i) Page Back or Forward, wherein a section of the content of a file, or some online content, webpage or hyperlink that precedes or follows the content currently displayed or associated with the flexible surface, is navigated to such that it causes said content to render on said flexible surface or some other display.
j) Open or Close, wherein some file or digital information associated with said flexible display is opened or closed, read into memory, or written out to a permanent medium.
k) Sleep or Wake, wherein said flexible display surface is de-activated, or activated from a state of reduced energy use.
l) Deformation, wherein the shape of a three dimensional model associated with said flexible surface is deformed in such way that it follows the deformation of said flexible surface, in any dimension.
m) Save, wherein the file associated with the flexible display is saved to a hard drive or other permanent medium.
n) Move or Copy, wherein a section of the content of a file, or other digital information, or some selection thereof, currently associated with said flexible surface is transferred to another flexible surface or computing device.
o) Duplicate, wherein the information or file associated with one flexible surface is made identical to that of a second flexible surface so as to clone or duplicate said information on said second flexible surface.
p) Select, where items or images, or both, displayed within a flexible display surface are selected; and
q) Stationary, wherein said flexible surface or some other display shows a set of icons indicating a set of computing applications, or potential functionality of said flexible surface or display, and wherein said computing system restricts the context of interaction of said computing system to said application functionality after selection of said icon of said application.
21. The method of claim 15 wherein the flexible surface is a flexible display or flexible display surface.
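Finally, a minimal sketch of the Page Down, Page Up and Navigate commands of claim 20 (items f, g and h), tracking which section of a file's content is rendered on a flexible surface. The class and method names are assumptions introduced for illustration only.

```python
# Editorial sketch of claim 20(f), (g) and (h): which section of a file is
# currently rendered on a flexible surface. Names are illustrative assumptions.
class SurfaceContent:
    def __init__(self, sections):
        self.sections = sections        # renderable sections of the file's content
        self.index = 0                  # section currently shown on the surface

    def page_down(self):                # claim 20(f): show the subsequent section
        self.index = min(self.index + 1, len(self.sections) - 1)
        return self.sections[self.index]

    def page_up(self):                  # claim 20(g): show the preceding section
        self.index = max(self.index - 1, 0)
        return self.sections[self.index]

    def navigate(self, target):         # claim 20(h): jump to an arbitrary section
        self.index = max(0, min(target, len(self.sections) - 1))
        return self.sections[self.index]
```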
US11/731,447 2006-03-30 2007-03-30 Interaction techniques for flexible displays Abandoned US20070247422A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US11/731,447 US20070247422A1 (en) 2006-03-30 2007-03-30 Interaction techniques for flexible displays
US12/459,973 US20100045705A1 (en) 2006-03-30 2009-07-10 Interaction techniques for flexible displays
US13/228,681 US8466873B2 (en) 2006-03-30 2011-09-09 Interaction techniques for flexible displays
US13/589,732 US20130127748A1 (en) 2006-03-30 2012-08-20 Interaction techniques for flexible displays
US13/919,046 US20140085184A1 (en) 2006-03-30 2013-06-17 Interaction Techniques for Flexible Displays
US14/314,589 US20150309611A1 (en) 2006-03-30 2014-06-25 Interaction techniques for flexible displays
US15/293,419 US20170224140A1 (en) 2006-03-30 2016-10-14 Interaction techniques for flexible displays

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US78840506P 2006-03-30 2006-03-30
US11/731,447 US20070247422A1 (en) 2006-03-30 2007-03-30 Interaction techniques for flexible displays

Related Child Applications (3)

Application Number Title Priority Date Filing Date
US12/459,973 Continuation-In-Part US20100045705A1 (en) 2006-03-30 2009-07-10 Interaction techniques for flexible displays
US12/459,973 Continuation US20100045705A1 (en) 2006-03-30 2009-07-10 Interaction techniques for flexible displays
US13/228,681 Continuation US8466873B2 (en) 2006-03-30 2011-09-09 Interaction techniques for flexible displays

Publications (1)

Publication Number Publication Date
US20070247422A1 true US20070247422A1 (en) 2007-10-25

Family

ID=38619042

Family Applications (3)

Application Number Title Priority Date Filing Date
US11/731,447 Abandoned US20070247422A1 (en) 2006-03-30 2007-03-30 Interaction techniques for flexible displays
US13/228,681 Active US8466873B2 (en) 2006-03-30 2011-09-09 Interaction techniques for flexible displays
US13/919,046 Abandoned US20140085184A1 (en) 2006-03-30 2013-06-17 Interaction Techniques for Flexible Displays

Family Applications After (2)

Application Number Title Priority Date Filing Date
US13/228,681 Active US8466873B2 (en) 2006-03-30 2011-09-09 Interaction techniques for flexible displays
US13/919,046 Abandoned US20140085184A1 (en) 2006-03-30 2013-06-17 Interaction Techniques for Flexible Displays

Country Status (1)

Country Link
US (3) US20070247422A1 (en)

Cited By (163)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070082378A1 (en) * 2005-10-07 2007-04-12 University Of Chicago Convergent synthesis of proteins by kinetically controlled ligation
US20070101268A1 (en) * 2005-11-01 2007-05-03 Microsoft Corporation Video booklet
US20070146243A1 (en) * 2005-12-27 2007-06-28 Lite-On Technology Corporation Variable-sized screen
US20080034365A1 (en) * 2006-08-07 2008-02-07 Bea Systems, Inc. System and method for providing hardware virtualization in a virtual machine environment
US20080158239A1 (en) * 2006-12-29 2008-07-03 X-Rite, Incorporated Surface appearance simulation
US20080288579A1 (en) * 2007-05-17 2008-11-20 Bea Systems, Inc. Ubiquitous Content Subscription and Delivery via a Smart Electronic Paper Device
US20090193348A1 (en) * 2008-01-30 2009-07-30 Microsoft Corporation Controlling an Integrated Messaging System Using Gestures
US20090271691A1 (en) * 2008-04-25 2009-10-29 Microsoft Corporation Linking digital and paper documents
US20090281629A1 (en) * 2008-05-05 2009-11-12 Christian Roebling Intervertebral disc prosthesis
WO2010004080A1 (en) * 2008-07-10 2010-01-14 Nokia Corporation User interface, device and method for a physically flexible device
US20100026642A1 (en) * 2008-07-31 2010-02-04 Samsung Electronics Co., Ltd. User interface apparatus and method using pattern recognition in handy terminal
US20100026719A1 (en) * 2008-07-31 2010-02-04 Sony Corporation Information processing apparatus, method, and program
US20100029335A1 (en) * 2008-08-04 2010-02-04 Harry Vartanian Apparatus and method for communicating multimedia documents or content over a wireless network to a digital periodical or advertising device
US20100045705A1 (en) * 2006-03-30 2010-02-25 Roel Vertegaal Interaction techniques for flexible displays
US20100053174A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible interface e-paper conformation
US20100053071A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible display containing electronic device conformation
US20100053074A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control based on bendable display containing electronic device conformation sequence status
US20100053068A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Bendable electronic device status information system and method
US20100053073A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control based on bendable display containing electronic device conformation sequence status
US20100056214A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Application control based on flexible interface conformation sequence status
US20100051680A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Application control based on flexible electronic device conformation sequence status
US20100053207A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Application control based on flexible electronic device conformation sequence status
US20100053067A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on e-paper conformation
US20100056223A1 (en) * 2008-09-02 2010-03-04 Choi Kil Soo Mobile terminal equipped with flexible display and controlling method thereof
US20100053217A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on E-paper conformation
US20100053075A1 (en) * 2008-08-29 2010-03-04 Searete Llc Display control based on bendable interface containing electronic device conformation sequence status
US20100053076A1 (en) * 2008-08-29 2010-03-04 Searete Llc Display control based on bendable interface containing electronic device conformation sequence status
US20100053122A1 (en) * 2008-08-29 2010-03-04 Searete Llc., A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible interface E-paper conformation
US20100053173A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible display containing electronic device conformation
US20100060565A1 (en) * 2008-08-29 2010-03-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on e-paper conformation
US20100060564A1 (en) * 2008-09-11 2010-03-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on e-paper conformation
US20100073278A1 (en) * 2008-08-29 2010-03-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper application control based on conformation sequence status
US20100073263A1 (en) * 2008-09-22 2010-03-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware, E-Paper application control based on conformation sequence status
US20100073333A1 (en) * 2008-09-22 2010-03-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper application control based on conformation sequence status
US20100073334A1 (en) * 2008-09-25 2010-03-25 Cohen Alexander J E-paper application control based on conformation sequence status
US20100079369A1 (en) * 2008-09-30 2010-04-01 Microsoft Corporation Using Physical Objects in Conjunction with an Interactive Surface
US20100085300A1 (en) * 2008-08-29 2010-04-08 Cohen Alexander J Bendable electronic interface external control system and method
US20100085301A1 (en) * 2008-08-29 2010-04-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Bendable electronic interface external control system and method
US20100085298A1 (en) * 2008-10-07 2010-04-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US20100085277A1 (en) * 2008-10-07 2010-04-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US20100091008A1 (en) * 2008-08-29 2010-04-15 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US20100090991A1 (en) * 2008-10-10 2010-04-15 Searete Llc, A Limited Liability Corporation Of The State Of Delaware. E-Paper display control based on conformation sequence status
WO2010041227A1 (en) * 2008-10-12 2010-04-15 Barit, Efrat Flexible devices and related methods of use
US20100100888A1 (en) * 2004-10-05 2010-04-22 Azul Systems, Inc. Resource allocation
US20100103123A1 (en) * 2008-08-29 2010-04-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Bendable electronic device status information system and method
US20100117954A1 (en) * 2008-11-07 2010-05-13 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US20100117955A1 (en) * 2008-08-29 2010-05-13 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US20100123689A1 (en) * 2008-11-14 2010-05-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper external control system and method
US20100164888A1 (en) * 2008-12-26 2010-07-01 Sony Corporation Display device
US20100214323A1 (en) * 2009-02-23 2010-08-26 Canon Kabushiki Kaisha Image processing system, image processing apparatus, display apparatus, method of controlling the same, and program
US20100229129A1 (en) * 2009-03-04 2010-09-09 Microsoft Corporation Creating organizational containers on a graphical user interface
US20100238114A1 (en) * 2009-03-18 2010-09-23 Harry Vartanian Apparatus and method for providing an elevated, indented, or texturized display device
US20100283743A1 (en) * 2009-05-07 2010-11-11 Microsoft Corporation Changing of list views on mobile device
US20100328267A1 (en) * 2009-06-30 2010-12-30 Hon Hai Precision Industry Co., Ltd. Optical touch device
US20110037742A1 (en) * 2009-08-13 2011-02-17 University-Industry Cooperation Group Of Kyung Hee University Cooperative multi-display
US20110047460A1 (en) * 2009-08-19 2011-02-24 Samsung Electronics Co., Ltd. Method and apparatus of electronic paper comprising a user interface
US20110054909A1 (en) * 2008-05-08 2011-03-03 Koninklijke Philips Electronics N.V. Localizing the position of a source of a voice signal
US20110055049A1 (en) * 2009-08-28 2011-03-03 Home Depot U.S.A., Inc. Method and system for creating an augmented reality experience in connection with a stored value token
US20110128238A1 (en) * 2009-11-27 2011-06-02 Lg Electronics Inc. Electric device and control method thereof
US20110161160A1 (en) * 2009-12-30 2011-06-30 Clear Channel Management Services, Inc. System and method for monitoring audience in response to signage
US20110199342A1 (en) * 2010-02-16 2011-08-18 Harry Vartanian Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound
US20110216091A1 (en) * 2010-03-04 2011-09-08 Song Hyunyoung Bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
US8174931B2 (en) 2010-10-08 2012-05-08 HJ Laboratories, LLC Apparatus and method for providing indoor location, position, or tracking of a mobile computer using building information
US20120159373A1 (en) * 2010-12-15 2012-06-21 Verizon Patent And Licensing, Inc. System for and method of generating dog ear bookmarks on a touch screen device
US20120188153A1 (en) * 2011-01-21 2012-07-26 Research In Motion Corporation Multi-bend display activation adaptation
US8262236B2 (en) 2008-06-17 2012-09-11 The Invention Science Fund I, Llc Systems and methods for transmitting information associated with change of a projection surface
US20120229440A1 (en) * 2008-06-05 2012-09-13 Bindu Rama Rao E-paper based digital document display device that retrieves updates autmatically
US8267526B2 (en) 2008-06-17 2012-09-18 The Invention Science Fund I, Llc Methods associated with receiving and transmitting information related to projection
US8289352B2 (en) 2010-07-15 2012-10-16 HJ Laboratories, LLC Providing erasable printing with nanoparticles
US8297495B2 (en) 2008-08-29 2012-10-30 The Invention Science Fund I, Llc Application control based on flexible interface conformation sequence status
US8308304B2 (en) 2008-06-17 2012-11-13 The Invention Science Fund I, Llc Systems associated with receiving and transmitting information related to projection
US8376558B2 (en) 2008-06-17 2013-02-19 The Invention Science Fund I, Llc Systems and methods for projecting in response to position change of a projection surface
US8384005B2 (en) 2008-06-17 2013-02-26 The Invention Science Fund I, Llc Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface
US20130085849A1 (en) * 2011-09-30 2013-04-04 Matthew G. Dyor Presenting opportunities for commercialization in a gesture-based user interface
CN103176735A (en) * 2011-12-23 2013-06-26 三星电子株式会社 Method and apparatus for controlling flexible display in portable terminal
US20130201115A1 (en) * 2012-02-08 2013-08-08 Immersion Corporation Method and apparatus for haptic flex gesturing
US20130201093A1 (en) * 2012-02-06 2013-08-08 Yongsin Kim Portable device and method for controlling the same
US20130215088A1 (en) * 2012-02-17 2013-08-22 Howon SON Electronic device including flexible display
US20130217496A1 (en) * 2012-02-20 2013-08-22 Jake Waldron Olkin Dynamic Game System And Associated Methods
US20130222222A1 (en) * 2012-02-24 2013-08-29 Nokia Corporation Method and apparatus for presenting multi-dimensional representations of an image dependent upon the shape of a display
WO2013136204A1 (en) * 2012-02-24 2013-09-19 Nokia Corporation A method, apparatus and computer program for displaying content on a foldable display
US20130300732A1 (en) * 2012-05-11 2013-11-14 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US8602564B2 (en) 2008-06-17 2013-12-10 The Invention Science Fund I, Llc Methods and systems for projecting in response to position
US8608321B2 (en) 2008-06-17 2013-12-17 The Invention Science Fund I, Llc Systems and methods for projecting in response to conformation
US20140002419A1 (en) * 2012-06-28 2014-01-02 Motorola Mobility Llc Systems and Methods for Processing Content Displayed on a Flexible Display
US8641203B2 (en) 2008-06-17 2014-02-04 The Invention Science Fund I, Llc Methods and systems for receiving and transmitting signals between server and projector apparatuses
US20140035872A1 (en) * 2008-10-24 2014-02-06 Samsung Electronics Co., Ltd. Input device for foldable display device and input method thereof
US8646689B2 (en) 2007-12-28 2014-02-11 Cognex Corporation Deformable light pattern for machine vision system
US8665236B2 (en) 2011-09-26 2014-03-04 Apple Inc. Electronic device with wrap around display
US20140111629A1 (en) * 2012-10-20 2014-04-24 Margaret Morris System for dynamic projection of media
US8723787B2 (en) 2008-06-17 2014-05-13 The Invention Science Fund I, Llc Methods and systems related to an image capture projection surface
US8733952B2 (en) 2008-06-17 2014-05-27 The Invention Science Fund I, Llc Methods and systems for coordinated use of two or more user responsive projectors
US8743244B2 (en) 2011-03-21 2014-06-03 HJ Laboratories, LLC Providing augmented reality based on third party information
US20140195898A1 (en) * 2013-01-04 2014-07-10 Roel Vertegaal Computing Apparatus
EP2763018A1 (en) * 2013-02-01 2014-08-06 Samsung Display Co., Ltd. Stretchable display and method of controlling the same
US8803060B2 (en) 2009-01-12 2014-08-12 Cognex Corporation Modular focus system alignment for image based readers
US8816977B2 (en) 2011-03-21 2014-08-26 Apple Inc. Electronic devices with flexible displays
US8820939B2 (en) 2008-06-17 2014-09-02 The Invention Science Fund I, Llc Projection associated methods and systems
US8857999B2 (en) 2008-06-17 2014-10-14 The Invention Science Fund I, Llc Projection in response to conformation
EP2787425A3 (en) * 2013-04-02 2014-12-17 Samsung Display Co., Ltd. Optical detection of bending motions of a flexible display
US8929085B2 (en) 2011-09-30 2015-01-06 Apple Inc. Flexible electronic devices
US8934228B2 (en) 2011-03-21 2015-01-13 Apple Inc. Display-based speaker structures for electronic devices
US8936367B2 (en) 2008-06-17 2015-01-20 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US20150033193A1 (en) * 2013-07-25 2015-01-29 Here Global B.V. Methods for modifying images and related aspects
US8944608B2 (en) 2008-06-17 2015-02-03 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8966681B2 (en) 2013-02-26 2015-03-03 Linda L. Burch Exercise mat
US8988381B1 (en) * 2014-02-14 2015-03-24 Lg Electronics Inc. Mobile terminal and method for controlling the same
US9007300B2 (en) 2011-10-14 2015-04-14 Blackberry Limited Method and system to control a process with bend movements
US20150113480A1 (en) * 2012-06-27 2015-04-23 Oce-Technologies B.V. User interaction system for displaying digital objects
US20150199941A1 (en) * 2014-01-15 2015-07-16 Nokia Corporation 3d touch sensor reader
US9178970B2 (en) 2011-03-21 2015-11-03 Apple Inc. Electronic devices with convex displays
US9218526B2 (en) 2012-05-24 2015-12-22 HJ Laboratories, LLC Apparatus and method to detect a paper document using one or more sensors
US9239647B2 (en) * 2012-08-20 2016-01-19 Samsung Electronics Co., Ltd Electronic device and method for changing an object according to a bending state
US20160048170A1 (en) * 2014-08-13 2016-02-18 Samsung Electronics Co., Ltd. Method and electronic device for processing image
US20160098132A1 (en) * 2014-10-07 2016-04-07 Samsung Electronics Co., Ltd. Electronic device including flexible display
US20160109985A1 (en) * 2013-11-14 2016-04-21 Nokia Technologies Oy Flexible device deformation measurement
US9329469B2 (en) 2011-02-17 2016-05-03 Microsoft Technology Licensing, Llc Providing an interactive experience using a 3D depth camera and a 3D projector
US20160139674A1 (en) * 2014-11-19 2016-05-19 Kabushiki Kaisha Toshiba Information processing device, image projection device, and information processing method
WO2016078266A1 (en) * 2014-11-18 2016-05-26 中兴通讯股份有限公司 Method and device for capturing image and storage medium
US9373123B2 (en) 2009-12-30 2016-06-21 Iheartmedia Management Services, Inc. Wearable advertising ratings methods and systems
US9389776B2 (en) 2012-12-20 2016-07-12 Samsung Display Co., Ltd. Switching complex, flexible display apparatus having the same and method of generating input signal using the same
US9400576B2 (en) 2011-07-19 2016-07-26 Apple Inc. Touch sensor arrangements for organic light-emitting diode displays
US9418479B1 (en) * 2010-12-23 2016-08-16 Amazon Technologies, Inc. Quasi-virtual objects in an augmented reality environment
US9480907B2 (en) 2011-03-02 2016-11-01 Microsoft Technology Licensing, Llc Immersive display with peripheral illusions
US20160339337A1 (en) * 2015-05-21 2016-11-24 Castar, Inc. Retroreflective surface with integrated fiducial markers for an augmented reality system
US9509981B2 (en) 2010-02-23 2016-11-29 Microsoft Technology Licensing, Llc Projectors and depth cameras for deviceless augmented reality and interaction
US9557862B2 (en) 2013-12-17 2017-01-31 Industrial Technology Research Institute Bend sensor, bend sensing method and bend sensing system for flexible display panel
US9566404B2 (en) 2014-03-05 2017-02-14 General Electric Company Medical vaporizer
US20170068277A1 (en) * 2008-08-29 2017-03-09 Searete Llc Bendable Electronic Device Status Information System and Method
US9597587B2 (en) 2011-06-08 2017-03-21 Microsoft Technology Licensing, Llc Locational node device
US9608216B2 (en) * 2013-05-30 2017-03-28 Samsung Display Co., Ltd. Flexible display device and method of manufacturing the same
CN106713747A (en) * 2016-11-29 2017-05-24 维沃移动通信有限公司 Focusing method and mobile terminal
US20170212602A1 (en) * 2014-07-31 2017-07-27 Hewlett-Packard Development Company, L.P Virtual reality clamshell computing device
US20170285849A1 (en) * 2014-09-05 2017-10-05 Samsung Electronics Co., Ltd. Touch screen panel, electronic notebook, and mobile terminal
US9823696B2 (en) 2012-04-27 2017-11-21 Nokia Technologies Oy Limiting movement
US9823707B2 (en) 2012-01-25 2017-11-21 Nokia Technologies Oy Contortion of an electronic apparatus
US20170351384A1 (en) * 2016-06-02 2017-12-07 Coretronic Corporation Touch display system and correciton method thereof
WO2017213969A1 (en) * 2016-06-09 2017-12-14 Microsoft Technology Licensing, Llc Modular extension of inertial controller for six dof mixed reality input
US20170357333A1 (en) 2016-06-09 2017-12-14 Alexandru Octavian Balan Passive optical and inertial tracking in slim form-factor
US9866660B2 (en) 2011-03-21 2018-01-09 Apple Inc. Electronic devices with concave displays
US9952706B2 (en) * 2012-07-30 2018-04-24 Samsung Electronics Co., Ltd. Flexible device for providing bending interaction guide and control method thereof
US9983729B2 (en) 2010-05-21 2018-05-29 Nokia Technologies Oy Method, an apparatus and a computer program for controlling an output from a display of an apparatus
US20180150110A1 (en) * 2016-11-25 2018-05-31 Fuji Xerox Co., Ltd. Display apparatus, image processing apparatus, and non-transitory computer readable medium
US10002588B2 (en) 2015-03-20 2018-06-19 Microsoft Technology Licensing, Llc Electronic paper display device
US10013060B2 (en) 2015-09-18 2018-07-03 Immersion Corporation Systems and methods for providing haptic effects in response to deformation of a cover for an electronic device
DE102017010327A1 (en) 2017-11-08 2018-07-05 Daimler Ag Device for displaying visual information for a user
US10067312B2 (en) 2011-11-22 2018-09-04 Cognex Corporation Vision system camera with mount for multiple lens types
US10075630B2 (en) 2013-07-03 2018-09-11 HJ Laboratories, LLC Providing real-time, personal services by accessing components on a mobile device
US10324675B2 (en) * 2011-04-25 2019-06-18 Sony Corporation Communication apparatus, communication control method, and computer-readable storage mediuim
US10324559B2 (en) 2015-09-01 2019-06-18 Japan Display Inc. Display device unit, control device, and image display panel
WO2019153783A1 (en) * 2018-02-08 2019-08-15 华南理工大学 Dynamic dance image capture and restoration system based on flexible sensor, and control method
US10498934B2 (en) 2011-11-22 2019-12-03 Cognex Corporation Camera system with exchangeable illumination assembly
US10528853B1 (en) * 2012-06-29 2020-01-07 Amazon Technologies, Inc. Shape-Based Edge Detection
USD875096S1 (en) * 2017-11-28 2020-02-11 Samsung Display Co., Ltd. Display device
US10621893B2 (en) 2017-03-30 2020-04-14 Sharp Kabushiki Kaisha Display device, manufacturing method for display device, manufacturing apparatus of display device, mounting device, and controller
US10692345B1 (en) * 2019-03-20 2020-06-23 Bi Incorporated Systems and methods for textural zone monitoring
US10720082B1 (en) 2016-09-08 2020-07-21 Ctskh, Llc Device and system to teach stem lessons using hands-on learning method
CN111684387A (en) * 2017-12-29 2020-09-18 马里亚·弗朗西斯卡·琼斯 Display device
US20210062332A1 (en) * 2018-04-25 2021-03-04 Aixtron Se Component coated with multiple two-dimensional layers, and coating method
US11016557B2 (en) 2009-08-28 2021-05-25 Home Depot Product Authority, Llc Method and system for creating a personalized experience in connection with a stored value token
US11366284B2 (en) 2011-11-22 2022-06-21 Cognex Corporation Vision system camera with mount for multiple lens types and lens module for the same
US20220339529A1 (en) * 2021-04-22 2022-10-27 Jon L. Keener Multi-dimensional word spelling board game
US11936964B2 (en) 2021-09-06 2024-03-19 Cognex Corporation Camera system with exchangeable illumination assembly

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10452207B2 (en) 2005-05-18 2019-10-22 Power2B, Inc. Displays and information input devices
WO2008111079A2 (en) 2007-03-14 2008-09-18 Power2B, Inc. Interactive devices
US7782274B2 (en) 2006-06-09 2010-08-24 Cfph, Llc Folding multimedia display device
US8493344B2 (en) * 2009-06-07 2013-07-23 Apple Inc. Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface
US20120272180A1 (en) * 2011-04-20 2012-10-25 Nokia Corporation Method and apparatus for providing content flipping based on a scrolling operation
US8751971B2 (en) * 2011-06-05 2014-06-10 Apple Inc. Devices, methods, and graphical user interfaces for providing accessibility using a touch-sensitive surface
US20130120106A1 (en) 2011-11-16 2013-05-16 Motorola Mobility, Inc. Display device, corresponding systems, and methods therefor
KR20130056674A (en) * 2011-11-22 2013-05-30 삼성전자주식회사 Flexible display apparatus and method for providing user interface by using the same
US8669956B2 (en) 2011-12-30 2014-03-11 Lg Electronics Inc. Bending threshold and release for a flexible display device
US8881269B2 (en) 2012-03-31 2014-11-04 Apple Inc. Device, method, and graphical user interface for integrating recognition of handwriting gestures with a screen reader
CN104254871B (en) * 2012-04-23 2017-10-27 英派尔科技开发有限公司 Skew control deformable display
US20130285921A1 (en) * 2012-04-25 2013-10-31 Motorola Mobility, Inc. Systems and Methods for a Rollable Illumination Device
KR101901611B1 (en) * 2012-05-09 2018-09-27 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
KR102052370B1 (en) 2012-06-14 2020-01-08 엘지전자 주식회사 Flexible Portable Device
KR102104588B1 (en) 2012-07-11 2020-04-24 삼성전자주식회사 Flexible display apparatus and operating method thereof
EP2701357B1 (en) * 2012-08-20 2017-08-02 Alcatel Lucent A method for establishing an authorized communication between a physical object and a communication device
US9329679B1 (en) * 2012-08-23 2016-05-03 Amazon Technologies, Inc. Projection system with multi-surface projection screen
US9632539B2 (en) 2012-10-16 2017-04-25 At&T Intellectual Property I, L.P Automatic shape adjustment of flexible display
USD795854S1 (en) * 2012-11-09 2017-08-29 Samsung Display Co., Ltd. Mobile phone
KR101909492B1 (en) 2012-12-27 2018-12-18 삼성전자주식회사 Method for interacting with flexible device and user terminal thereof
KR20140092059A (en) * 2013-01-15 2014-07-23 삼성전자주식회사 Method for controlling portable device equipped with flexible display and portable device thereof
US9622365B2 (en) 2013-02-25 2017-04-11 Google Technology Holdings LLC Apparatus and methods for accommodating a display in an electronic device
US9674922B2 (en) 2013-03-14 2017-06-06 Google Technology Holdings LLC Display side edge assembly and mobile device including same
CN105579905A (en) * 2013-05-02 2016-05-11 汤姆逊许可公司 Rear projection system with a foldable projection screen for mobile devices
US9484001B2 (en) 2013-12-23 2016-11-01 Google Technology Holdings LLC Portable electronic device controlling diffuse light source to emit light approximating color of object of user interest
KR102224478B1 (en) 2014-04-15 2021-03-08 엘지전자 주식회사 Flexible display device with touch sensitive surface and Method for controlling the same
US9575512B2 (en) 2014-04-15 2017-02-21 Lg Electronics Inc. Flexible touch sensitive display device and control method thereof
JP6079695B2 (en) * 2014-05-09 2017-02-15 コニカミノルタ株式会社 Image display photographing system, photographing device, display device, image display and photographing method, and computer program
KR102262721B1 (en) 2014-05-12 2021-06-09 엘지전자 주식회사 Foldable display device and method for controlling the same
KR102210632B1 (en) 2014-06-09 2021-02-02 엘지전자 주식회사 The Apparatus and Method for Display Device executing bending operation without physical bending
KR102246554B1 (en) 2014-06-18 2021-04-30 엘지전자 주식회사 Portable display device and controlling method thereof
US10613585B2 (en) * 2014-06-19 2020-04-07 Samsung Electronics Co., Ltd. Transparent display apparatus, group play system using transparent display apparatus and performance methods thereof
US9999280B2 (en) 2014-06-27 2018-06-19 David Gareth Zebley Interactive bracelet for practicing an activity between user devices
KR102243655B1 (en) 2014-09-02 2021-04-23 엘지전자 주식회사 Display apparatus and controlling method thereof
KR102342462B1 (en) 2014-12-08 2021-12-27 삼성디스플레이 주식회사 Rollable display device
KR102277260B1 (en) 2014-12-29 2021-07-14 엘지전자 주식회사 Terminal device and controlling method thereof
KR20160080851A (en) 2014-12-29 2016-07-08 엘지전자 주식회사 Display apparatus and controlling method thereof
US9626785B2 (en) 2015-03-23 2017-04-18 International Business Machines Corporation Using a bending pattern to arrange files on a flexible display
JP6459705B2 (en) 2015-03-27 2019-01-30 セイコーエプソン株式会社 Interactive projector, interactive projection system, and interactive projector control method
US10691207B2 (en) 2015-09-22 2020-06-23 Hewlett-Packard Development Company, L.P. Display devices with virtual reprsentations of electronic devices
US9904447B2 (en) 2016-01-08 2018-02-27 Microsoft Technology Licensing, Llc Universal inking support
US10048790B2 (en) 2016-03-15 2018-08-14 International Business Machines Corporation Digital object sharing using a flexible display
DK201670580A1 (en) 2016-06-12 2018-01-02 Apple Inc Wrist-based tactile time feedback for non-sighted users
CN107038710B (en) * 2017-02-15 2019-07-02 长安大学 It is a kind of using paper as the Vision Tracking of target
US10996761B2 (en) 2019-06-01 2021-05-04 Apple Inc. User interfaces for non-visual output of time

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6630922B2 (en) * 1997-08-29 2003-10-07 Xerox Corporation Handedness detection for a physical manipulatory grammar
US20040119716A1 (en) * 2002-12-20 2004-06-24 Chang Joon Park Apparatus and method for high-speed marker-free motion capture
US6937210B1 (en) * 2002-11-06 2005-08-30 The United States Of America As Represented By The Secretary Of Commerce Projecting images on a sphere
US20070091178A1 (en) * 2005-10-07 2007-04-26 Cotter Tim S Apparatus and method for performing motion capture using a random pattern on capture surfaces

Family Cites Families (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6275219B1 (en) 1993-08-23 2001-08-14 Ncr Corporation Digitizing projection display
US5996082A (en) 1995-10-16 1999-11-30 Packard Bell Nec System and method for delaying a wake-up signal
EP0849697B1 (en) 1996-12-20 2003-02-12 Hitachi Europe Limited A hand gesture recognition system and method
US6297805B1 (en) * 1997-08-29 2001-10-02 Xerox Corporation Multiple interacting computers interfaceable through a physical manipulatory grammar
US6243075B1 (en) 1997-08-29 2001-06-05 Xerox Corporation Graspable device manipulation for controlling a computer display
US6340957B1 (en) * 1997-08-29 2002-01-22 Xerox Corporation Dynamically relocatable tileable displays
US6268857B1 (en) * 1997-08-29 2001-07-31 Xerox Corporation Computer user interface using a physical manipulatory grammar
US6297838B1 (en) * 1997-08-29 2001-10-02 Xerox Corporation Spinning as a morpheme for a physical manipulatory grammar
US6160540A (en) * 1998-01-12 2000-12-12 Xerox Company Zoomorphic computer user interface
US7614008B2 (en) 2004-07-30 2009-11-03 Apple Inc. Operation of a computer with touch screen interface
US7800592B2 (en) 2005-03-04 2010-09-21 Apple Inc. Hand held electronic device with multiple touch sensing devices
US6256019B1 (en) 1999-03-30 2001-07-03 Eremote, Inc. Methods of using a controller for controlling multi-user access to the functionality of consumer devices
US6757002B1 (en) 1999-11-04 2004-06-29 Hewlett-Packard Development Company, L.P. Track pad pointing device with areas of specialized function
US20020004749A1 (en) 2000-02-09 2002-01-10 Froseth Barrie R. Customized food selection, ordering and distribution system and method
AU2001253161A1 (en) 2000-04-04 2001-10-15 Stick Networks, Inc. Method and apparatus for scheduling presentation of digital content on a personal communication device
WO2002019079A1 (en) 2000-08-31 2002-03-07 Sony Corporation Information recorded medium, information display, information providing device, and information providing system
US7918808B2 (en) 2000-09-20 2011-04-05 Simmons John C Assistive clothing
US6764652B2 (en) 2001-01-24 2004-07-20 The Regents Of The University Of Michigan Micromachined device for receiving and retaining at least one liquid droplet, method of making the device and method of using the device
US6870519B2 (en) 2001-03-28 2005-03-22 Intel Corporation Methods for tiling multiple display elements to form a single display
CA2354256A1 (en) 2001-07-17 2003-01-17 Charles A. Annand Predetermined ordering system
US7345671B2 (en) 2001-10-22 2008-03-18 Apple Inc. Method and apparatus for use of rotational user inputs
US20030098857A1 (en) * 2001-11-28 2003-05-29 Palm, Inc. Detachable flexible and expandable display with touch sensor apparatus and method
US7456823B2 (en) * 2002-06-14 2008-11-25 Sony Corporation User interface apparatus and portable information apparatus
US20050075923A1 (en) 2003-03-14 2005-04-07 E. & J. Gallo Winery Method and apparatus for managing product planning and marketing
US7196689B2 (en) * 2003-03-31 2007-03-27 Canon Kabushiki Kaisha Information device
JP2005174006A (en) * 2003-12-11 2005-06-30 Canon Inc Display device
US20050146507A1 (en) 2004-01-06 2005-07-07 Viredaz Marc A. Method and apparatus for interfacing with a graphical user interface using a control interface
US7401300B2 (en) 2004-01-09 2008-07-15 Nokia Corporation Adaptive user interface input device
US7176888B2 (en) 2004-03-23 2007-02-13 Fujitsu Limited Selective engagement of motion detection
US20060007135A1 (en) 2004-07-06 2006-01-12 Kazuyuki Imagawa Image display device and viewing intention judging device
US20060036395A1 (en) 2004-07-30 2006-02-16 Shaya Steven A Method and apparatus for measuring and controlling food intake of an individual
US7724242B2 (en) 2004-08-06 2010-05-25 Touchtable, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US7728821B2 (en) 2004-08-06 2010-06-01 Touchtable, Inc. Touch detecting interactive display
US7719523B2 (en) 2004-08-06 2010-05-18 Touchtable, Inc. Bounding box gesture recognition on a touch detecting interactive display
US20070046643A1 (en) 2004-08-06 2007-03-01 Hillis W Daniel State-Based Approach to Gesture Identification
US7728823B2 (en) 2004-09-24 2010-06-01 Apple Inc. System and method for processing raw data of track pad device
US7163311B2 (en) 2004-10-22 2007-01-16 Kramer James F Foodware having visual sensory stimulating or sensing means
US7417417B2 (en) 2005-04-22 2008-08-26 Don Patrick Williams Spill-resistant beverage container with detection and notification indicator
US20060267966A1 (en) 2005-05-24 2006-11-30 Microsoft Corporation Hover widgets: using the tracking state to extend capabilities of pen-operated devices
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20100045705A1 (en) 2006-03-30 2010-02-25 Roel Vertegaal Interaction techniques for flexible displays
US7770136B2 (en) 2007-01-24 2010-08-03 Microsoft Corporation Gesture recognition interactive feedback
US8059101B2 (en) 2007-06-22 2011-11-15 Apple Inc. Swipe gestures for touch screen keyboards

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6630922B2 (en) * 1997-08-29 2003-10-07 Xerox Corporation Handedness detection for a physical manipulatory grammar
US6937210B1 (en) * 2002-11-06 2005-08-30 The United States Of America As Represented By The Secretary Of Commerce Projecting images on a sphere
US20040119716A1 (en) * 2002-12-20 2004-06-24 Chang Joon Park Apparatus and method for high-speed marker-free motion capture
US20070091178A1 (en) * 2005-10-07 2007-04-26 Cotter Tim S Apparatus and method for performing motion capture using a random pattern on capture surfaces

Cited By (345)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8589920B2 (en) 2004-10-05 2013-11-19 Azul Systems, Inc. Resource allocation
US20100100888A1 (en) * 2004-10-05 2010-04-22 Azul Systems, Inc. Resource allocation
US20070082378A1 (en) * 2005-10-07 2007-04-12 University Of Chicago Convergent synthesis of proteins by kinetically controlled ligation
US7840898B2 (en) * 2005-11-01 2010-11-23 Microsoft Corporation Video booklet
US20070101268A1 (en) * 2005-11-01 2007-05-03 Microsoft Corporation Video booklet
US20070146243A1 (en) * 2005-12-27 2007-06-28 Lite-On Technology Corporation Variable-sized screen
US7440265B2 (en) * 2005-12-27 2008-10-21 Lite-On Technology Corporation Variable-sized screen
US20100045705A1 (en) * 2006-03-30 2010-02-25 Roel Vertegaal Interaction techniques for flexible displays
US9875122B2 (en) 2006-08-07 2018-01-23 Oracle International Corporation System and method for providing hardware virtualization in a virtual machine environment
US20080034365A1 (en) * 2006-08-07 2008-02-07 Bea Systems, Inc. System and method for providing hardware virtualization in a virtual machine environment
US8250572B2 (en) 2006-08-07 2012-08-21 Oracle International Corporation System and method for providing hardware virtualization in a virtual machine environment
US8806493B2 (en) 2006-08-07 2014-08-12 Oracle International Corporation System and method for providing hardware virtualization in a virtual machine environment
US20080158239A1 (en) * 2006-12-29 2008-07-03 X-Rite, Incorporated Surface appearance simulation
US9767599B2 (en) * 2006-12-29 2017-09-19 X-Rite Inc. Surface appearance simulation
US7945638B2 (en) * 2007-05-17 2011-05-17 Oracle International Corporation Ubiquitous content subscription and delivery via a smart electronic paper device
US20080288579A1 (en) * 2007-05-17 2008-11-20 Bea Systems, Inc. Ubiquitous Content Subscription and Delivery via a Smart Electronic Paper Device
US8646689B2 (en) 2007-12-28 2014-02-11 Cognex Corporation Deformable light pattern for machine vision system
US20090193348A1 (en) * 2008-01-30 2009-07-30 Microsoft Corporation Controlling an Integrated Messaging System Using Gestures
US8762892B2 (en) 2008-01-30 2014-06-24 Microsoft Corporation Controlling an integrated messaging system using gestures
US8286068B2 (en) * 2008-04-25 2012-10-09 Microsoft Corporation Linking digital and paper documents
US20090271691A1 (en) * 2008-04-25 2009-10-29 Microsoft Corporation Linking digital and paper documents
US20090281629A1 (en) * 2008-05-05 2009-11-12 Christian Roebling Intervertebral disc prosthesis
US20110054909A1 (en) * 2008-05-08 2011-03-03 Koninklijke Philips Electronics N.V. Localizing the position of a source of a voice signal
US8831954B2 (en) * 2008-05-08 2014-09-09 Nuance Communications, Inc. Localizing the position of a source of a voice signal
US20120229440A1 (en) * 2008-06-05 2012-09-13 Bindu Rama Rao E-paper based digital document display device that retrieves updates autmatically
US9443368B2 (en) * 2008-06-05 2016-09-13 Bindu Rama Rao E-paper based digital document display device that retrieves updates automatically
US8641203B2 (en) 2008-06-17 2014-02-04 The Invention Science Fund I, Llc Methods and systems for receiving and transmitting signals between server and projector apparatuses
US8267526B2 (en) 2008-06-17 2012-09-18 The Invention Science Fund I, Llc Methods associated with receiving and transmitting information related to projection
US8262236B2 (en) 2008-06-17 2012-09-11 The Invention Science Fund I, Llc Systems and methods for transmitting information associated with change of a projection surface
US8733952B2 (en) 2008-06-17 2014-05-27 The Invention Science Fund I, Llc Methods and systems for coordinated use of two or more user responsive projectors
US8723787B2 (en) 2008-06-17 2014-05-13 The Invention Science Fund I, Llc Methods and systems related to an image capture projection surface
US8820939B2 (en) 2008-06-17 2014-09-02 The Invention Science Fund I, Llc Projection associated methods and systems
US8857999B2 (en) 2008-06-17 2014-10-14 The Invention Science Fund I, Llc Projection in response to conformation
US8608321B2 (en) 2008-06-17 2013-12-17 The Invention Science Fund I, Llc Systems and methods for projecting in response to conformation
US8602564B2 (en) 2008-06-17 2013-12-10 The Invention Science Fund I, Llc Methods and systems for projecting in response to position
US8955984B2 (en) 2008-06-17 2015-02-17 The Invention Science Fund I, Llc Projection associated methods and systems
US8308304B2 (en) 2008-06-17 2012-11-13 The Invention Science Fund I, Llc Systems associated with receiving and transmitting information related to projection
US8936367B2 (en) 2008-06-17 2015-01-20 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8376558B2 (en) 2008-06-17 2013-02-19 The Invention Science Fund I, Llc Systems and methods for projecting in response to position change of a projection surface
US8540381B2 (en) 2008-06-17 2013-09-24 The Invention Science Fund I, Llc Systems and methods for receiving information associated with projecting
US8939586B2 (en) 2008-06-17 2015-01-27 The Invention Science Fund I, Llc Systems and methods for projecting in response to position
US8384005B2 (en) 2008-06-17 2013-02-26 The Invention Science Fund I, Llc Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface
US8944608B2 (en) 2008-06-17 2015-02-03 The Invention Science Fund I, Llc Systems and methods associated with projecting in response to conformation
US8430515B2 (en) 2008-06-17 2013-04-30 The Invention Science Fund I, Llc Systems and methods for projecting
US8403501B2 (en) 2008-06-17 2013-03-26 The Invention Science Fund, I, LLC Motion responsive devices and systems
WO2010004080A1 (en) * 2008-07-10 2010-01-14 Nokia Corporation User interface, device and method for a physically flexible device
US20100011291A1 (en) * 2008-07-10 2010-01-14 Nokia Corporation User interface, device and method for a physically flexible device
EP2613234A1 (en) * 2008-07-10 2013-07-10 Nokia Corporation User interface, device and method for a physically flexible device
US20100026719A1 (en) * 2008-07-31 2010-02-04 Sony Corporation Information processing apparatus, method, and program
US20100026642A1 (en) * 2008-07-31 2010-02-04 Samsung Electronics Co., Ltd. User interface apparatus and method using pattern recognition in handy terminal
US8847977B2 (en) * 2008-07-31 2014-09-30 Sony Corporation Information processing apparatus to flip image and display additional information, and associated methodology
US8554286B2 (en) 2008-08-04 2013-10-08 HJ Laboratories, LLC Mobile electronic device adaptively responsive to motion and user based controls
US9684341B2 (en) 2008-08-04 2017-06-20 Apple Inc. Mobile electronic device with an adaptively responsive flexible display
US20100029335A1 (en) * 2008-08-04 2010-02-04 Harry Vartanian Apparatus and method for communicating multimedia documents or content over a wireless network to a digital periodical or advertising device
US8855727B2 (en) 2008-08-04 2014-10-07 Apple Inc. Mobile electronic device with an adaptively responsive flexible display
US9332113B2 (en) 2008-08-04 2016-05-03 Apple Inc. Mobile electronic device with an adaptively responsive flexible display
US8346319B2 (en) 2008-08-04 2013-01-01 HJ Laboratories, LLC Providing a converted document to multimedia messaging service (MMS) messages
US10241543B2 (en) 2008-08-04 2019-03-26 Apple Inc. Mobile electronic device with an adaptively responsive flexible display
US10802543B2 (en) 2008-08-04 2020-10-13 Apple Inc. Mobile electronic device with an adaptively responsive flexible display
US11385683B2 (en) 2008-08-04 2022-07-12 Apple Inc. Mobile electronic device with an adaptively responsive flexible display
US8068886B2 (en) 2008-08-04 2011-11-29 HJ Laboratories, LLC Apparatus and method for providing an electronic device having adaptively responsive displaying of information
US20110183722A1 (en) * 2008-08-04 2011-07-28 Harry Vartanian Apparatus and method for providing an electronic device having a flexible display
US8396517B2 (en) 2008-08-04 2013-03-12 HJ Laboratories, LLC Mobile electronic device adaptively responsive to advanced motion
US7953462B2 (en) 2008-08-04 2011-05-31 Vartanian Harry Apparatus and method for providing an adaptively responsive flexible display device
US20100103123A1 (en) * 2008-08-29 2010-04-29 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Bendable electronic device status information system and method
US20100053076A1 (en) * 2008-08-29 2010-03-04 Searete Llc Display control based on bendable interface containing electronic device conformation sequence status
US20100053174A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible interface e-paper conformation
US20100053071A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible display containing electronic device conformation
US9176637B2 (en) * 2008-08-29 2015-11-03 Invention Science Fund I, Llc Display control based on bendable interface containing electronic device conformation sequence status
US20100053074A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control based on bendable display containing electronic device conformation sequence status
US20100053068A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Bendable electronic device status information system and method
US20100053073A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control based on bendable display containing electronic device conformation sequence status
US20140340306A1 (en) * 2008-08-29 2014-11-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Bendable Electronic Device Status Information System and Method
US8866731B2 (en) 2008-08-29 2014-10-21 The Invention Science Fund I, Llc E-paper display control of classified content based on e-paper conformation
US20100056214A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Application control based on flexible interface conformation sequence status
US20100051680A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Application control based on flexible electronic device conformation sequence status
US8235280B2 (en) 2008-08-29 2012-08-07 The Invention Science Fund I, Llc E-paper display control of classified content based on E-paper conformation
US8240548B2 (en) * 2008-08-29 2012-08-14 The Invention Science Fund I, Llc Display control of classified content based on flexible display containing electronic device conformation
US20100053207A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Application control based on flexible electronic device conformation sequence status
US8251278B2 (en) 2008-08-29 2012-08-28 The Invention Science Fund I, Llc Display control based on bendable display containing electronic device conformation sequence status
US20100053067A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on e-paper conformation
US20100053217A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on E-paper conformation
US9411375B2 (en) * 2008-08-29 2016-08-09 Invention Science Fund I, Llc Bendable electronic device status information system and method
US8272571B2 (en) 2008-08-29 2012-09-25 The Invention Science Fund I, Llc E-paper display control of classified content based on e-paper conformation
US8777099B2 (en) * 2008-08-29 2014-07-15 The Invention Science Fund I, Llc Bendable electronic device status information system and method
US20100053075A1 (en) * 2008-08-29 2010-03-04 Searete Llc Display control based on bendable interface containing electronic device conformation sequence status
US20100053122A1 (en) * 2008-08-29 2010-03-04 Searete Llc., A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible interface E-paper conformation
US20100053173A1 (en) * 2008-08-29 2010-03-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Display control of classified content based on flexible display containing electronic device conformation
US8297495B2 (en) 2008-08-29 2012-10-30 The Invention Science Fund I, Llc Application control based on flexible interface conformation sequence status
US8708220B2 (en) * 2008-08-29 2014-04-29 The Invention Science Fund I, Llc Display control based on bendable interface containing electronic device conformation sequence status
US8322599B2 (en) 2008-08-29 2012-12-04 The Invention Science Fund I, Llc Display control of classified content based on flexible interface e-paper conformation
US8646693B2 (en) * 2008-08-29 2014-02-11 The Invention Science Fund I, Llc Application control based on flexible electronic device conformation sequence status
US20100060565A1 (en) * 2008-08-29 2010-03-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on e-paper conformation
US20100117955A1 (en) * 2008-08-29 2010-05-13 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US8613394B2 (en) * 2008-08-29 2013-12-24 The Invention Science Fund I, Llc Bendable electronic interface external control system and method
US20170068277A1 (en) * 2008-08-29 2017-03-09 Searete Llc Bendable Electronic Device Status Information System and Method
US8393531B2 (en) 2008-08-29 2013-03-12 The Invention Science Fund I, Llc Application control based on flexible electronic device conformation sequence status
US20100073278A1 (en) * 2008-08-29 2010-03-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper application control based on conformation sequence status
US8596521B2 (en) 2008-08-29 2013-12-03 The Invention Science Fund I, Llc E-paper display control based on conformation sequence status
US20100085300A1 (en) * 2008-08-29 2010-04-08 Cohen Alexander J Bendable electronic interface external control system and method
US8544722B2 (en) * 2008-08-29 2013-10-01 The Invention Science Fund I, Llc Bendable electronic interface external control system and method
US20100085301A1 (en) * 2008-08-29 2010-04-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Bendable electronic interface external control system and method
US8462104B2 (en) * 2008-08-29 2013-06-11 The Invention Science Fund I, Llc E-paper display control based on conformation sequence status
US8466870B2 (en) 2008-08-29 2013-06-18 The Invention Science Fund I, LLC E-paper application control based on conformation sequence status
US8517251B2 (en) * 2008-08-29 2013-08-27 The Invention Science Fund I, Llc Application control based on flexible interface conformation sequence status
US8511563B2 (en) * 2008-08-29 2013-08-20 The Invention Science Fund I, Llc Display control of classified content based on flexible interface E-paper conformation
US8500002B2 (en) * 2008-08-29 2013-08-06 The Invention Science Fund I, Llc Display control based on bendable display containing electronic device conformation sequence status
US20100091008A1 (en) * 2008-08-29 2010-04-15 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US8490860B2 (en) * 2008-08-29 2013-07-23 The Invention Science Fund I, Llc Display control of classified content based on flexible display containing electronic device conformation
US8485426B2 (en) 2008-08-29 2013-07-16 The Invention Science Fund I, Llc Bendable electronic device status information system and method
US20100056223A1 (en) * 2008-09-02 2010-03-04 Choi Kil Soo Mobile terminal equipped with flexible display and controlling method thereof
KR101472021B1 (en) * 2008-09-02 2014-12-24 LG Electronics Inc. Mobile terminal equipped with flexible display and controlling method thereof
US8543166B2 (en) * 2008-09-02 2013-09-24 Lg Electronics Inc. Mobile terminal equipped with flexible display and controlling method thereof
US20100060564A1 (en) * 2008-09-11 2010-03-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control of classified content based on e-paper conformation
US8624833B2 (en) * 2008-09-11 2014-01-07 The Invention Science Fund I, Llc E-paper display control of classified content based on e-paper conformation
US20100073263A1 (en) * 2008-09-22 2010-03-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware, E-Paper application control based on conformation sequence status
US20100073333A1 (en) * 2008-09-22 2010-03-25 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper application control based on conformation sequence status
US20100073334A1 (en) * 2008-09-25 2010-03-25 Cohen Alexander J E-paper application control based on conformation sequence status
CN102165394A (en) * 2008-09-30 2011-08-24 微软公司 Using physical objects in conjunction with an interactive surface
US8427424B2 (en) 2008-09-30 2013-04-23 Microsoft Corporation Using physical objects in conjunction with an interactive surface
US9372552B2 (en) 2008-09-30 2016-06-21 Microsoft Technology Licensing, Llc Using physical objects in conjunction with an interactive surface
US20100079369A1 (en) * 2008-09-30 2010-04-01 Microsoft Corporation Using Physical Objects in Conjunction with an Interactive Surface
WO2010039349A3 (en) * 2008-09-30 2010-05-27 Microsoft Corporation Using physical objects in conjunction with an interactive surface
US10346529B2 (en) 2008-09-30 2019-07-09 Microsoft Technology Licensing, Llc Using physical objects in conjunction with an interactive surface
US20100085277A1 (en) * 2008-10-07 2010-04-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US9035870B2 (en) * 2008-10-07 2015-05-19 The Invention Science Fund I, Llc E-paper display control based on conformation sequence status
US20100085298A1 (en) * 2008-10-07 2010-04-08 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US8446357B2 (en) * 2008-10-07 2013-05-21 The Invention Science Fund I, Llc E-paper display control based on conformation sequence status
US20100090991A1 (en) * 2008-10-10 2010-04-15 Searete Llc, A Limited Liability Corporation Of The State Of Delaware. E-Paper display control based on conformation sequence status
US8493336B2 (en) 2008-10-10 2013-07-23 The Invention Science Fund I, Llc E-paper display control based on conformation sequence status
WO2010041227A1 (en) * 2008-10-12 2010-04-15 Barit, Efrat Flexible devices and related methods of use
US11726521B2 (en) 2008-10-12 2023-08-15 Samsung Electronics Co., Ltd. Flexible devices and related methods of use
US10339892B2 (en) 2008-10-12 2019-07-02 Samsung Electronics Co., Ltd. Flexible devices and related methods of use
US11294424B2 (en) 2008-10-12 2022-04-05 Samsung Electronics Co., Ltd. Flexible devices and related methods of use
US20140035872A1 (en) * 2008-10-24 2014-02-06 Samsung Electronics Co., Ltd. Input device for foldable display device and input method thereof
US10055031B2 (en) * 2008-10-24 2018-08-21 Samsung Electronics Co., Ltd. Input device for foldable display device and input method thereof
US20100117954A1 (en) * 2008-11-07 2010-05-13 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper display control based on conformation sequence status
US8584930B2 (en) * 2008-11-07 2013-11-19 The Invention Science Fund I, Llc E-paper display control based on conformation sequence status
US8786574B2 (en) 2008-11-14 2014-07-22 The Invention Science Fund I, Llc E-paper external control system and method
US20100123689A1 (en) * 2008-11-14 2010-05-20 Searete Llc, A Limited Liability Corporation Of The State Of Delaware E-paper external control system and method
US20100124879A1 (en) * 2008-11-14 2010-05-20 Searete Llc. E-paper external control system and method
US8279199B2 (en) 2008-11-14 2012-10-02 The Invention Science Fund I, Llc E-paper external control system and method
US20100164888A1 (en) * 2008-12-26 2010-07-01 Sony Corporation Display device
US8581859B2 (en) * 2008-12-26 2013-11-12 Sony Corporation Display device
US8803060B2 (en) 2009-01-12 2014-08-12 Cognex Corporation Modular focus system alignment for image based readers
US20100214323A1 (en) * 2009-02-23 2010-08-26 Canon Kabushiki Kaisha Image processing system, image processing apparatus, display apparatus, method of controlling the same, and program
US20100229129A1 (en) * 2009-03-04 2010-09-09 Microsoft Corporation Creating organizational containers on a graphical user interface
US9459728B2 (en) 2009-03-18 2016-10-04 HJ Laboratories, LLC Mobile device with individually controllable tactile sensations
US8686951B2 (en) 2009-03-18 2014-04-01 HJ Laboratories, LLC Providing an elevated and texturized display in an electronic device
US9400558B2 (en) 2009-03-18 2016-07-26 HJ Laboratories, LLC Providing an elevated and texturized display in an electronic device
US9772772B2 (en) 2009-03-18 2017-09-26 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US10191652B2 (en) 2009-03-18 2019-01-29 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US9448632B2 (en) 2009-03-18 2016-09-20 Hj Laboratories Licensing, Llc Mobile device with a pressure and indentation sensitive multi-touch display
US9335824B2 (en) 2009-03-18 2016-05-10 HJ Laboratories, LLC Mobile device with a pressure and indentation sensitive multi-touch display
US8866766B2 (en) 2009-03-18 2014-10-21 HJ Laboratories, LLC Individually controlling a tactile area of an image displayed on a multi-touch display
US9405371B1 (en) 2009-03-18 2016-08-02 HJ Laboratories, LLC Controllable tactile sensations in a consumer device
US9778840B2 (en) 2009-03-18 2017-10-03 Hj Laboratories Licensing, Llc Electronic device with an interactive pressure sensitive multi-touch display
US9423905B2 (en) 2009-03-18 2016-08-23 Hj Laboratories Licensing, Llc Providing an elevated and texturized display in a mobile electronic device
US9547368B2 (en) 2009-03-18 2017-01-17 Hj Laboratories Licensing, Llc Electronic device with a pressure sensitive multi-touch display
US20100238114A1 (en) * 2009-03-18 2010-09-23 Harry Vartanian Apparatus and method for providing an elevated, indented, or texturized display device
US10367993B2 (en) * 2009-05-07 2019-07-30 Microsoft Technology Licensing, Llc Changing of list views on mobile device
US9507509B2 (en) * 2009-05-07 2016-11-29 Microsoft Technology Licensing, Llc Changing of list views on mobile device
US8669945B2 (en) 2009-05-07 2014-03-11 Microsoft Corporation Changing of list views on mobile device
US20140129995A1 (en) * 2009-05-07 2014-05-08 Microsoft Corporation Changing of list views on mobile device
US20100283743A1 (en) * 2009-05-07 2010-11-11 Microsoft Corporation Changing of list views on mobile device
US20100328267A1 (en) * 2009-06-30 2010-12-30 Hon Hai Precision Industry Co., Ltd. Optical touch device
WO2011019214A1 (en) * 2009-08-13 2011-02-17 University-Industry Cooperation Group Of Kyung Hee University Cooperative multi-display
US20110037742A1 (en) * 2009-08-13 2011-02-17 University-Industry Cooperation Group Of Kyung Hee University Cooperative multi-display
US8508471B2 (en) * 2009-08-13 2013-08-13 University-Industry Cooperation Group Of Kyung Hee University Cooperative multi-display
US20110047460A1 (en) * 2009-08-19 2011-02-24 Samsung Electronics Co., Ltd. Method and apparatus of electronic paper comprising a user interface
US9323378B2 (en) * 2009-08-19 2016-04-26 Samsung Electronics Co., Ltd. Method and apparatus of electronic paper comprising a user interface
US20110055049A1 (en) * 2009-08-28 2011-03-03 Home Depot U.S.A., Inc. Method and system for creating an augmented reality experience in connection with a stored value token
US11016557B2 (en) 2009-08-28 2021-05-25 Home Depot Product Authority, Llc Method and system for creating a personalized experience in connection with a stored value token
US8645220B2 (en) 2009-08-28 2014-02-04 Homer Tlc, Inc. Method and system for creating an augmented reality experience in connection with a stored value token
US20110128238A1 (en) * 2009-11-27 2011-06-02 Lg Electronics Inc. Electric device and control method thereof
US9373123B2 (en) 2009-12-30 2016-06-21 Iheartmedia Management Services, Inc. Wearable advertising ratings methods and systems
US20110161160A1 (en) * 2009-12-30 2011-06-30 Clear Channel Management Services, Inc. System and method for monitoring audience in response to signage
US9047256B2 (en) 2009-12-30 2015-06-02 Iheartmedia Management Services, Inc. System and method for monitoring audience in response to signage
US10496170B2 (en) 2010-02-16 2019-12-03 HJ Laboratories, LLC Vehicle computing system to provide feedback
US20110199342A1 (en) * 2010-02-16 2011-08-18 Harry Vartanian Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound
US9509981B2 (en) 2010-02-23 2016-11-29 Microsoft Technology Licensing, Llc Projectors and depth cameras for deviceless augmented reality and interaction
US20110216091A1 (en) * 2010-03-04 2011-09-08 Song Hyunyoung Bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
US9128537B2 (en) * 2010-03-04 2015-09-08 Autodesk, Inc. Bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
US9983729B2 (en) 2010-05-21 2018-05-29 Nokia Technologies Oy Method, an apparatus and a computer program for controlling an output from a display of an apparatus
US8289352B2 (en) 2010-07-15 2012-10-16 HJ Laboratories, LLC Providing erasable printing with nanoparticles
US10962652B2 (en) 2010-10-08 2021-03-30 Samsung Electronics Co., Ltd. Determining context of a mobile computer
US9244173B1 (en) * 2010-10-08 2016-01-26 Samsung Electronics Co. Ltd. Determining context of a mobile computer
US9110159B2 (en) 2010-10-08 2015-08-18 HJ Laboratories, LLC Determining indoor location or position of a mobile computer using building information
US8174931B2 (en) 2010-10-08 2012-05-08 HJ Laboratories, LLC Apparatus and method for providing indoor location, position, or tracking of a mobile computer using building information
US9116230B2 (en) 2010-10-08 2015-08-25 HJ Laboratories, LLC Determining floor location and movement of a mobile computer in a building
US10107916B2 (en) 2010-10-08 2018-10-23 Samsung Electronics Co., Ltd. Determining context of a mobile computer
US8842496B2 (en) 2010-10-08 2014-09-23 HJ Laboratories, LLC Providing indoor location, position, or tracking of a mobile computer using a room dimension
US9182494B2 (en) 2010-10-08 2015-11-10 HJ Laboratories, LLC Tracking a mobile computer indoors using wi-fi and motion sensor information
US8284100B2 (en) 2010-10-08 2012-10-09 HJ Laboratories, LLC Providing indoor location, position, or tracking of a mobile computer using sensors
US9684079B2 (en) 2010-10-08 2017-06-20 Samsung Electronics Co., Ltd. Determining context of a mobile computer
US8395968B2 (en) 2010-10-08 2013-03-12 HJ Laboratories, LLC Providing indoor location, position, or tracking of a mobile computer using building information
US9176230B2 (en) 2010-10-08 2015-11-03 HJ Laboratories, LLC Tracking a mobile computer indoors using Wi-Fi, motion, and environmental sensors
US20120159373A1 (en) * 2010-12-15 2012-06-21 Verizon Patent And Licensing, Inc. System for and method of generating dog ear bookmarks on a touch screen device
US9418479B1 (en) * 2010-12-23 2016-08-16 Amazon Technologies, Inc. Quasi-virtual objects in an augmented reality environment
US20120188153A1 (en) * 2011-01-21 2012-07-26 Research In Motion Corporation Multi-bend display activation adaptation
US9552127B2 (en) * 2011-01-21 2017-01-24 Blackberry Limited Multi-bend display activation adaptation
US20140068473A1 (en) * 2011-01-21 2014-03-06 Blackberry Limited Multi-bend display activation adaptation
US8587539B2 (en) * 2011-01-21 2013-11-19 Blackberry Limited Multi-bend display activation adaptation
US9329469B2 (en) 2011-02-17 2016-05-03 Microsoft Technology Licensing, Llc Providing an interactive experience using a 3D depth camera and a 3D projector
US9480907B2 (en) 2011-03-02 2016-11-01 Microsoft Technology Licensing, Llc Immersive display with peripheral illusions
US9721489B2 (en) 2011-03-21 2017-08-01 HJ Laboratories, LLC Providing augmented reality based on third party information
US10348875B2 (en) 2011-03-21 2019-07-09 Apple Inc. Electronic devices with convex displays
US8934228B2 (en) 2011-03-21 2015-01-13 Apple Inc. Display-based speaker structures for electronic devices
US10088927B2 (en) 2011-03-21 2018-10-02 Apple Inc. Electronic devices with flexible displays
US11394815B2 (en) 2011-03-21 2022-07-19 Apple Inc. Electronic devices with convex displays
US9178970B2 (en) 2011-03-21 2015-11-03 Apple Inc. Electronic devices with convex displays
US8743244B2 (en) 2011-03-21 2014-06-03 HJ Laboratories, LLC Providing augmented reality based on third party information
US10931802B2 (en) 2011-03-21 2021-02-23 Apple Inc. Electronic devices with concave displays
US9866660B2 (en) 2011-03-21 2018-01-09 Apple Inc. Electronic devices with concave displays
US8816977B2 (en) 2011-03-21 2014-08-26 Apple Inc. Electronic devices with flexible displays
US9756158B2 (en) 2011-03-21 2017-09-05 Apple Inc. Electronic devices with convex displays
US10735569B2 (en) 2011-03-21 2020-08-04 Apple Inc. Electronic devices with convex displays
US10324675B2 (en) * 2011-04-25 2019-06-18 Sony Corporation Communication apparatus, communication control method, and computer-readable storage medium
US9597587B2 (en) 2011-06-08 2017-03-21 Microsoft Technology Licensing, Llc Locational node device
US9400576B2 (en) 2011-07-19 2016-07-26 Apple Inc. Touch sensor arrangements for organic light-emitting diode displays
US9939978B2 (en) 2011-07-19 2018-04-10 Apple Inc. Touch sensitive displays
US11487330B2 (en) 2011-09-26 2022-11-01 Apple Inc. Electronic device with wrap around display
US8665236B2 (en) 2011-09-26 2014-03-04 Apple Inc. Electronic device with wrap around display
US10345860B2 (en) 2011-09-26 2019-07-09 Apple Inc. Electronic device with wrap around display
US11137799B2 (en) 2011-09-26 2021-10-05 Apple Inc. Electronic device with wrap around display
US10318061B2 (en) 2011-09-30 2019-06-11 Apple Inc. Flexible electronic devices
US9557874B2 (en) 2011-09-30 2017-01-31 Apple Inc. Flexible electronic devices
US9274562B2 (en) 2011-09-30 2016-03-01 Apple Inc. Flexible electronic devices
US20130085849A1 (en) * 2011-09-30 2013-04-04 Matthew G. Dyor Presenting opportunities for commercialization in a gesture-based user interface
US11675390B2 (en) 2011-09-30 2023-06-13 Apple Inc. Flexible electronic devices
US9971448B2 (en) 2011-09-30 2018-05-15 Apple Inc. Flexible electronic devices
US8929085B2 (en) 2011-09-30 2015-01-06 Apple Inc. Flexible electronic devices
US10739908B2 (en) 2011-09-30 2020-08-11 Apple Inc. Flexible electronic devices
US9007300B2 (en) 2011-10-14 2015-04-14 Blackberry Limited Method and system to control a process with bend movements
US10678019B2 (en) 2011-11-22 2020-06-09 Cognex Corporation Vision system camera with mount for multiple lens types
US10498933B2 (en) 2011-11-22 2019-12-03 Cognex Corporation Camera system with exchangeable illumination assembly
US11366284B2 (en) 2011-11-22 2022-06-21 Cognex Corporation Vision system camera with mount for multiple lens types and lens module for the same
US10067312B2 (en) 2011-11-22 2018-09-04 Cognex Corporation Vision system camera with mount for multiple lens types
US11115566B2 (en) 2011-11-22 2021-09-07 Cognex Corporation Camera system with exchangeable illumination assembly
US11921350B2 (en) 2011-11-22 2024-03-05 Cognex Corporation Vision system camera with mount for multiple lens types and lens module for the same
US10498934B2 (en) 2011-11-22 2019-12-03 Cognex Corporation Camera system with exchangeable illumination assembly
US8963833B2 (en) * 2011-12-23 2015-02-24 Samsung Electronics Co., Ltd. Method and apparatus for controlling flexible display in portable terminal
US20130162556A1 (en) * 2011-12-23 2013-06-27 Samsung Electronics Co., Ltd Method and apparatus for controlling flexible display in portable terminal
CN103176735A (en) * 2011-12-23 2013-06-26 三星电子株式会社 Method and apparatus for controlling flexible display in portable terminal
EP2607983A1 (en) * 2011-12-23 2013-06-26 Samsung Electronics Co., Ltd Method and apparatus for moving an object on a flexible display in a portable terminal
WO2013094866A1 (en) * 2011-12-23 2013-06-27 Samsung Electronics Co., Ltd. Method and apparatus for controlling flexible display in portable terminal
US9823707B2 (en) 2012-01-25 2017-11-21 Nokia Technologies Oy Contortion of an electronic apparatus
US20130201093A1 (en) * 2012-02-06 2013-08-08 Yongsin Kim Portable device and method for controlling the same
US8947354B2 (en) 2012-02-06 2015-02-03 Lg Electronics Inc. Portable device and method for controlling the same
US8952893B2 (en) 2012-02-06 2015-02-10 Lg Electronics Inc. Portable device and method for controlling the same
US9046918B2 (en) 2012-02-06 2015-06-02 Lg Electronics Inc. Portable device and method for controlling the same
US8610663B2 (en) * 2012-02-06 2013-12-17 Lg Electronics Inc. Portable device and method for controlling the same
US20130201115A1 (en) * 2012-02-08 2013-08-08 Immersion Corporation Method and apparatus for haptic flex gesturing
US10133401B2 (en) * 2012-02-08 2018-11-20 Immersion Corporation Method and apparatus for haptic flex gesturing
US9411423B2 (en) * 2012-02-08 2016-08-09 Immersion Corporation Method and apparatus for haptic flex gesturing
US20130215088A1 (en) * 2012-02-17 2013-08-22 Howon SON Electronic device including flexible display
US9672796B2 (en) * 2012-02-17 2017-06-06 Lg Electronics Inc. Electronic device including flexible display
US20130217496A1 (en) * 2012-02-20 2013-08-22 Jake Waldron Olkin Dynamic Game System And Associated Methods
US8821280B2 (en) * 2012-02-20 2014-09-02 Jake Waldron Olkin Dynamic game system and associated methods
US20130222222A1 (en) * 2012-02-24 2013-08-29 Nokia Corporation Method and apparatus for presenting multi-dimensional representations of an image dependent upon the shape of a display
WO2013136204A1 (en) * 2012-02-24 2013-09-19 Nokia Corporation A method, apparatus and computer program for displaying content on a foldable display
US9767605B2 (en) * 2012-02-24 2017-09-19 Nokia Technologies Oy Method and apparatus for presenting multi-dimensional representations of an image dependent upon the shape of a display
US9804734B2 (en) 2012-02-24 2017-10-31 Nokia Technologies Oy Method, apparatus and computer program for displaying content
US9823696B2 (en) 2012-04-27 2017-11-21 Nokia Technologies Oy Limiting movement
US10719972B2 (en) 2012-05-11 2020-07-21 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US9805494B2 (en) * 2012-05-11 2017-10-31 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US20190172248A1 (en) * 2012-05-11 2019-06-06 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
CN109240427A (en) * 2012-05-11 2019-01-18 株式会社半导体能源研究所 The display methods of electronic equipment
US11815956B2 (en) 2012-05-11 2023-11-14 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US10380783B2 (en) * 2012-05-11 2019-08-13 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US11216041B2 (en) 2012-05-11 2022-01-04 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US20130300732A1 (en) * 2012-05-11 2013-11-14 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US10467797B2 (en) 2012-05-11 2019-11-05 Semiconductor Energy Laboratory Co., Ltd. Electronic device, storage medium, program, and displaying method
US9578200B2 (en) 2012-05-24 2017-02-21 HJ Laboratories, LLC Detecting a document using one or more sensors
US9218526B2 (en) 2012-05-24 2015-12-22 HJ Laboratories, LLC Apparatus and method to detect a paper document using one or more sensors
US10599923B2 (en) 2012-05-24 2020-03-24 HJ Laboratories, LLC Mobile device utilizing multiple cameras
US9959464B2 (en) 2012-05-24 2018-05-01 HJ Laboratories, LLC Mobile device utilizing multiple cameras for environmental detection
US20150113480A1 (en) * 2012-06-27 2015-04-23 Oce-Technologies B.V. User interaction system for displaying digital objects
US20140002419A1 (en) * 2012-06-28 2014-01-02 Motorola Mobility Llc Systems and Methods for Processing Content Displayed on a Flexible Display
US8970455B2 (en) * 2012-06-28 2015-03-03 Google Technology Holdings LLC Systems and methods for processing content displayed on a flexible display
US11354879B1 (en) 2012-06-29 2022-06-07 Amazon Technologies, Inc. Shape-based edge detection
US10528853B1 (en) * 2012-06-29 2020-01-07 Amazon Technologies, Inc. Shape-based edge detection
US9952706B2 (en) * 2012-07-30 2018-04-24 Samsung Electronics Co., Ltd. Flexible device for providing bending interaction guide and control method thereof
US9239647B2 (en) * 2012-08-20 2016-01-19 Samsung Electronics Co., Ltd Electronic device and method for changing an object according to a bending state
US20140111629A1 (en) * 2012-10-20 2014-04-24 Margaret Morris System for dynamic projection of media
US9389776B2 (en) 2012-12-20 2016-07-12 Samsung Display Co., Ltd. Switching complex, flexible display apparatus having the same and method of generating input signal using the same
US20140195898A1 (en) * 2013-01-04 2014-07-10 Roel Vertegaal Computing Apparatus
US9841867B2 (en) * 2013-01-04 2017-12-12 Roel Vertegaal Computing apparatus for displaying a plurality of electronic documents to a user
US9367894B2 (en) 2013-02-01 2016-06-14 Samsung Display Co., Ltd. Stretchable display and method of controlling the same
EP2763018A1 (en) * 2013-02-01 2014-08-06 Samsung Display Co., Ltd. Stretchable display and method of controlling the same
KR20140099128A (en) * 2013-02-01 2014-08-11 Samsung Display Co., Ltd. Stretchable display and method for controlling the same
KR102090711B1 (en) * 2013-02-01 2020-03-19 Samsung Display Co., Ltd. Stretchable display and method for controlling the same
US8966681B2 (en) 2013-02-26 2015-03-03 Linda L. Burch Exercise mat
EP2787425A3 (en) * 2013-04-02 2014-12-17 Samsung Display Co., Ltd. Optical detection of bending motions of a flexible display
US9990004B2 (en) 2013-04-02 2018-06-05 Samsung Display Co., Ltd. Optical detection of bending motions of a flexible display
US9608216B2 (en) * 2013-05-30 2017-03-28 Samsung Display Co., Ltd. Flexible display device and method of manufacturing the same
US10075630B2 (en) 2013-07-03 2018-09-11 HJ Laboratories, LLC Providing real-time, personal services by accessing components on a mobile device
US20150033193A1 (en) * 2013-07-25 2015-01-29 Here Global B.V. Methods for modifying images and related aspects
US9766762B2 (en) * 2013-11-14 2017-09-19 Nokia Technologies Oy Flexible device deformation measurement
US20160109985A1 (en) * 2013-11-14 2016-04-21 Nokia Technologies Oy Flexible device deformation measurement
US9557862B2 (en) 2013-12-17 2017-01-31 Industrial Technology Research Institute Bend sensor, bend sensing method and bend sensing system for flexible display panel
US20150199941A1 (en) * 2014-01-15 2015-07-16 Nokia Corporation 3d touch sensor reader
US8988381B1 (en) * 2014-02-14 2015-03-24 Lg Electronics Inc. Mobile terminal and method for controlling the same
US9566404B2 (en) 2014-03-05 2017-02-14 General Electric Company Medical vaporizer
US11129951B2 (en) 2014-03-05 2021-09-28 General Electric Company Medical vaporizer
US10838503B2 (en) * 2014-07-31 2020-11-17 Hewlett-Packard Development Company, L.P. Virtual reality clamshell computing device
US20170212602A1 (en) * 2014-07-31 2017-07-27 Hewlett-Packard Development Company, L.P. Virtual reality clamshell computing device
US20160048170A1 (en) * 2014-08-13 2016-02-18 Samsung Electronics Co., Ltd. Method and electronic device for processing image
US20170285849A1 (en) * 2014-09-05 2017-10-05 Samsung Electronics Co., Ltd. Touch screen panel, electronic notebook, and mobile terminal
US10379662B2 (en) * 2014-09-05 2019-08-13 Samsung Electronics Co., Ltd. Touch screen panel, electronic notebook, and mobile terminal
US20160098132A1 (en) * 2014-10-07 2016-04-07 Samsung Electronics Co., Ltd. Electronic device including flexible display
US10108230B2 (en) * 2014-10-07 2018-10-23 Samsung Electronics Co., Ltd Electronic device including flexible display
WO2016078266A1 (en) * 2014-11-18 2016-05-26 ZTE Corporation Method and device for capturing image and storage medium
US10419742B2 (en) 2014-11-18 2019-09-17 Xi'an Zhongxing New Software Co. Ltd. Method and device for capturing image and storage medium
CN105678684A (en) * 2014-11-18 2016-06-15 中兴通讯股份有限公司 Image capture method and device
CN105678684B (en) * 2014-11-18 2020-11-03 中兴通讯股份有限公司 Method and device for intercepting image
US20160139674A1 (en) * 2014-11-19 2016-05-19 Kabushiki Kaisha Toshiba Information processing device, image projection device, and information processing method
US10002588B2 (en) 2015-03-20 2018-06-19 Microsoft Technology Licensing, Llc Electronic paper display device
US20160339337A1 (en) * 2015-05-21 2016-11-24 Castar, Inc. Retroreflective surface with integrated fiducial markers for an augmented reality system
US10324559B2 (en) 2015-09-01 2019-06-18 Japan Display Inc. Display device unit, control device, and image display panel
US10990340B2 (en) 2015-09-01 2021-04-27 Japan Display Inc. Display apparatus and control device
US10013060B2 (en) 2015-09-18 2018-07-03 Immersion Corporation Systems and methods for providing haptic effects in response to deformation of a cover for an electronic device
US10310614B2 (en) 2015-09-18 2019-06-04 Immersion Corporation Systems and methods for providing haptic effects in response to deformation of a cover for an electronic device
US10466793B2 (en) 2015-09-18 2019-11-05 Immersion Corporation Systems and methods for providing haptic effects in response to deformation of a cover for an electronic device
US20170351384A1 (en) * 2016-06-02 2017-12-07 Coretronic Corporation Touch display system and correction method thereof
CN109313500A (en) * 2016-06-09 2019-02-05 Microsoft Technology Licensing, LLC Passive optical and inertial tracking in slim form-factor
WO2017213969A1 (en) * 2016-06-09 2017-12-14 Microsoft Technology Licensing, Llc Modular extension of inertial controller for six dof mixed reality input
US20170357333A1 (en) 2016-06-09 2017-12-14 Alexandru Octavian Balan Passive optical and inertial tracking in slim form-factor
US10146335B2 (en) 2016-06-09 2018-12-04 Microsoft Technology Licensing, Llc Modular extension of inertial controller for six DOF mixed reality input
US10146334B2 (en) 2016-06-09 2018-12-04 Microsoft Technology Licensing, Llc Passive optical and inertial tracking in slim form-factor
WO2017213940A1 (en) * 2016-06-09 2017-12-14 Microsoft Technology Licensing, Llc Passive optical and inertial tracking in slim form-factor
US10720082B1 (en) 2016-09-08 2020-07-21 Ctskh, Llc Device and system to teach stem lessons using hands-on learning method
US20180150110A1 (en) * 2016-11-25 2018-05-31 Fuji Xerox Co., Ltd. Display apparatus, image processing apparatus, and non-transitory computer readable medium
CN106713747A (en) * 2016-11-29 2017-05-24 Vivo Mobile Communication Co., Ltd. Focusing method and mobile terminal
US10621893B2 (en) 2017-03-30 2020-04-14 Sharp Kabushiki Kaisha Display device, manufacturing method for display device, manufacturing apparatus of display device, mounting device, and controller
DE102017010327A1 (en) 2017-11-08 2018-07-05 Daimler Ag Device for displaying visual information for a user
USD875096S1 (en) * 2017-11-28 2020-02-11 Samsung Display Co., Ltd. Display device
CN111684387A (en) * 2017-12-29 2020-09-18 Maria Francisca Jones Display device
WO2019153783A1 (en) * 2018-02-08 2019-08-15 South China University of Technology Dynamic dance image capture and restoration system based on flexible sensor, and control method
US20210062332A1 (en) * 2018-04-25 2021-03-04 Aixtron Se Component coated with multiple two-dimensional layers, and coating method
US10692345B1 (en) * 2019-03-20 2020-06-23 Bi Incorporated Systems and methods for textural zone monitoring
US11270564B2 (en) 2019-03-20 2022-03-08 Bi Incorporated Systems and methods for textural zone monitoring
US11837065B2 (en) 2019-03-20 2023-12-05 Bi Incorporated Systems and methods for textural zone monitoring
US20220339529A1 (en) * 2021-04-22 2022-10-27 Jon L. Keener Multi-dimensional word spelling board game
US11936964B2 (en) 2021-09-06 2024-03-19 Cognex Corporation Camera system with exchangeable illumination assembly

Also Published As

Publication number Publication date
US20140085184A1 (en) 2014-03-27
US8466873B2 (en) 2013-06-18
US20120112994A1 (en) 2012-05-10

Similar Documents

Publication Publication Date Title
US8466873B2 (en) Interaction techniques for flexible displays
Holman et al. Paper windows: interaction techniques for digital paper
Marquardt et al. The continuous interaction space: interaction techniques unifying touch and gesture on and above a digital surface
Khalilbeigi et al. FoldMe: interacting with double-sided foldable displays
Song et al. MouseLight: bimanual interactions on digital paper using a pen and a spatially-aware mobile projector
Aliakseyeu et al. A computer support tool for the early stages of architectural design
Akaoka et al. DisplayObjects: prototyping functional physical interfaces on 3d styrofoam, paper or cardboard models
Cao et al. Interacting with dynamically defined information spaces using a handheld projector and a pen
US20150309611A1 (en) Interaction techniques for flexible displays
US8416206B2 (en) Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system
Poupyrev et al. Developing a generic augmented-reality interface
Malik et al. Interacting with large displays from a distance with vision-tracked multi-finger gestural input
Girouard et al. DisplayStacks: interaction techniques for stacks of flexible thin-film displays
Waldner et al. Tangible tiles: design and evaluation of a tangible user interface in a collaborative tabletop setup
CN108431729A (en) To increase the three dimensional object tracking of display area
Wightman et al. TouchMark: flexible document navigation and bookmarking techniques for e-book readers.
Malik An exploration of multi-finger interaction on multi-touch surfaces
Jota et al. The continuous interaction space: Interaction techniques unifying touch and gesture on and above a digital surface
Rau et al. A tangible object for general purposes in mobile augmented reality applications
Banerjee et al. Waveform: remote video blending for vjs using in-air multitouch gestures
Tarun Electronic paper computers: Interacting with flexible displays for physical manipulation of digital information
Parhizkar et al. Development of an augmented reality rare book and manuscript for special library collection (AR Rare-BM)
Naito et al. Interaction techniques using a spherical cursor for 3d targets acquisition and indicating in volumetric displays
Nishimura et al. A digital contents management system using a real booklet interface with augmented reality
Benko User interaction in hybrid multi-display environments

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION