US20130009915A1 - Controlling responsiveness to user inputs on a touch-sensitive display

Controlling responsiveness to user inputs on a touch-sensitive display

Info

Publication number
US20130009915A1
US20130009915A1 (application US 13/179,124)
Authority
US
United States
Prior art keywords
touch
sub
processor
sensitive display
interactions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/179,124
Inventor
Jean-Marc Hering
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nokia Oyj
Original Assignee
Nokia Oyj
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Oyj filed Critical Nokia Oyj
Priority to US 13/179,124
Assigned to NOKIA CORPORATION (assignor: HERING, JEAN-MARC)
Priority to PCT/IB2012/053436 (published as WO2013008151A1)
Publication of US20130009915A1
Priority to US 14/050,992 (granted as US8717327B2)
Current legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures


Abstract

A communications terminal comprises a processor and a touch-sensitive display for displaying content generated by a software application associated with the processor and for receiving touch-based interactions for use with said software application. A display controller is operable to selectively accept and reject touch-based interactions in one or more predetermined sub-region(s) of the touch-sensitive display. Interactions in said sub-region(s) are rejected in response to said software application being woken from an idle state by an event triggered independently of user interaction, and subsequently accepted in response to a predetermined event, for example user action in an active zone outside of the sub-region(s).

Description

    FIELD
  • This invention relates to user input control on a touch-sensitive display, particularly the selective rejection of input controls in predetermined sub-region(s) of the display.
  • BACKGROUND
  • It is common for data terminals such as mobile telephones, data tablets and PDAs to provide a touch-sensitive display through which a user can interact with software executed on a processor of the terminal.
  • It is also common for displays to occupy a significant proportion of, and sometimes almost all of, the area available on a given side of the terminal, leaving a relatively thin border of casing surrounding the screen perimeter. Whilst this offers an increased area for displaying and interacting with software applications on the terminal, it can create practical difficulties. In particular, users can unintentionally interact with software applications when holding or picking-up the terminal at its peripheral edges.
  • SUMMARY
  • A first aspect of the invention provides apparatus comprising at least one processor and at least one memory having computer-readable code stored thereon which when executed controls the at least one processor:
  • to cause display of content generated by a software application associated with the processor;
  • to receive signals indicative of touch inputs on a touch-sensitive display;
  • to respond to detection of the software application being woken from an idle state by entering a first mode in which touch-based interactions in one or more predetermined sub-regions of the touch-sensitive display are rejected, and
  • whilst in the first mode, to respond to detection of a touch-based interaction outside the one or more predetermined sub-regions by exiting the first mode and thereafter to respond to touch-based interactions in said one or more predetermined sub-regions of the touch-sensitive display.
  • The computer code when executed may control the processor subsequently to accept user interactions in the or each predetermined sub-region in response to receiving user interactions outside of said sub-region(s).
  • The computer code when executed may control the processor subsequently to accept user interactions in the or each predetermined sub-region in response to receiving user interactions made through the touch-sensitive display outside of said sub-region(s).
  • The apparatus may be a communications device, and the computer code when executed may control the processor to be unresponsive to interactions in said sub-region(s) in response to said software application being woken from an idle state by a received telephone call.
  • The apparatus may be a communications device, and the computer code when executed may control the processor to be unresponsive to interactions in said sub-region(s) in response to said software application being woken from an idle state by means of a received data message.
  • The computer code when executed may control the processor to be unresponsive to interactions in said sub-region(s) in response to said software application being woken from an idle state by means of an internal timer function.
  • The or each said sub-region may be at the perimeter of the touch-sensitive display. The sub-regions may comprise first and second separate sub-regions located on opposite perimeters of the touch-sensitive display.
  • The apparatus may be a mobile communications terminal.
  • A second aspect of the invention provides a method comprising:
  • causing display of content generated by a software application associated with a processor;
  • receiving signals indicative of touch inputs on a touch-sensitive display;
  • responding to detection of the software application being woken from an idle state by entering a first mode in which touch-based interactions in one or more predetermined sub-regions of the touch-sensitive display are rejected, and
  • whilst in the first mode, responding to detection of a touch-based interaction outside the one or more predetermined sub-regions by exiting the first mode and thereafter responding to touch-based interactions in said one or more predetermined sub-regions of the touch-sensitive display.
  • The invention also provides a computer program comprising instructions that when executed by computer apparatus control it to perform the method above.
  • A third aspect of the invention provides a non-transitory computer-readable storage medium having stored thereon computer-readable code, which, when executed by computing apparatus, causes the computing apparatus to perform a method comprising:
  • causing display of content generated by a software application associated with a processor;
  • receiving signals indicative of touch inputs on a touch-sensitive display;
  • responding to detection of the software application being woken from an idle state by entering a first mode in which touch-based interactions in one or more predetermined sub-regions of the touch-sensitive display are rejected, and
  • whilst in the first mode, responding to detection of a touch-based interaction outside the one or more predetermined sub-regions by exiting the first mode and thereafter responding to touch-based interactions in said one or more predetermined sub-regions of the touch-sensitive display.
  • A fourth aspect of the invention provides apparatus comprising:
  • means for causing display of content generated by a software application associated with a processor;
  • means for receiving signals indicative of touch inputs on a touch-sensitive display;
  • means for responding to detection of the software application being woken from an idle state by entering a first mode in which touch-based interactions in one or more predetermined sub-regions of the touch-sensitive display are rejected, and
  • means for responding, whilst in the first mode, to detection of a touch-based interaction outside the one or more predetermined sub-regions by exiting the first mode and thereafter responding to touch-based interactions in said one or more predetermined sub-regions of the touch-sensitive display.
  • A fifth aspect of the invention provides apparatus comprising at least one processor and at least one memory having computer-readable code stored thereon which when executed controls the at least one processor:
  • to cause display of content generated by a software application associated with the processor;
  • to receive signals indicative of touch inputs on a touch-sensitive display; and
  • to respond to detection of a content window provided by the software application being enlarged by entering a first mode in which touch-based interactions in one or more predetermined sub-regions at a perimeter of the touch-sensitive display are rejected.
  • The computer code when executed may control the processor to reject interactions in said one or more perimeter sub-regions in response to a user interface window associated with the application being enlarged in accordance with a maximise command.
  • The software application may be configured to control the processor to enlarge the window in response to one or more received touch inputs.
  • The computer code when executed may control the processor to be responsive to interactions in said one or more perimeter sub-regions in response to a predetermined event.
  • The computer code when executed may control the processor to be responsive to interactions in said perimeter sub-region(s) in response to user interactions made outside of said one or more perimeter sub-regions.
  • The computer code when executed may control the processor such that interactions are subsequently accepted in response to user interactions made through the touch-sensitive display outside of said one or more perimeter sub-regions.
  • The display may be elongate having opposed lengthwise edges and widthways edges, and the one or more perimeter sub-regions comprise first and second separate sub-regions located on lengthwise, opposite perimeters of the touch-sensitive display.
  • The apparatus may be a mobile communications terminal.
  • A sixth aspect of the invention provides a method comprising:
  • causing display of content generated by a software application associated with a processor;
  • receiving signals indicative of touch inputs on a touch-sensitive display; and
  • responding to detection of a content window provided by the software application being enlarged by entering a first mode in which touch-based interactions in one or more predetermined sub-regions at a perimeter of the touch-sensitive display are rejected.
  • The invention also provides a computer program comprising instructions that when executed by computer apparatus control it to perform the method above.
  • A seventh aspect of the invention provides a non-transitory computer-readable storage medium having stored thereon computer-readable code, which, when executed by computing apparatus, causes the computing apparatus to perform a method comprising:
  • causing display of content generated by a software application associated with a processor;
  • receiving signals indicative of touch inputs on a touch-sensitive display; and
  • responding to detection of a content window provided by the software application being enlarged by entering a first mode in which touch-based interactions in one or more predetermined sub-regions at a perimeter of the touch-sensitive display are rejected.
  • An eighth aspect of the invention provides apparatus comprising:
  • means for causing display of content generated by a software application associated with a processor;
  • means for receiving signals indicative of touch inputs on a touch-sensitive display; and
  • means for responding to detection of a content window provided by the software application being enlarged by entering a first mode in which touch-based interactions in one or more predetermined sub-regions at a perimeter of the touch-sensitive display are rejected.
  • BRIEF DESCRIPTION
  • Embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which:
  • FIG. 1 is a perspective view of a mobile terminal embodying aspects of the invention;
  • FIG. 2 is a schematic diagram illustrating components of the FIG. 1 mobile terminal and their interconnection;
  • FIG. 3 is a perspective diagram showing a representation of the mobile terminal shown in FIG. 1 when held in the hand of a user;
  • FIG. 4 is a block diagram showing the main operating modules providing user interaction control in accordance with embodiments of the invention;
  • FIGS. 5 a and 5 b show, respectively, the mobile terminal of FIG. 3 with dead, or non-active screen zones, shown overlaid, and a plan view of the screen with said dead zones shown overlaid;
  • FIGS. 6 a and 6 b show, respectively, a further example of the mobile terminal of FIG. 3 with dead zones shown overlaid, and a plan view of the screen with said dead zones shown overlaid;
  • FIG. 7 is a flow diagram showing the main processing steps performed by a display controller as shown in FIG. 4 operating in a first embodiment;
  • FIGS. 8 a and 8 b show the screen of the mobile terminal shown in FIG. 3, with an application window shown, respectively, at a normal size and at an enlarged size, which is useful for understanding a second embodiment; and
  • FIG. 9 is a flow diagram showing the main processing steps performed by a display controller as shown in FIG. 4 operating in a second embodiment.
  • DETAILED DESCRIPTION
  • Referring firstly to FIG. 1, a terminal 100 is shown. The exterior of the terminal 100 has a touch sensitive display 102, hardware keys 104, a speaker 118 and a headphone port 120.
  • FIG. 2 shows a schematic diagram of the components of terminal 100. The terminal 100 has a controller 106, a touch sensitive display 102 comprised of a display part 108 and a tactile interface part 110, the hardware keys 104, a memory 112, RAM 114, a speaker 118, the headphone port 120, a wireless communication module 122, an antenna 124 and a battery 116. The controller 106 is connected to each of the other components (except the battery 116) in order to control operation thereof.
  • The memory 112 may be a non-volatile memory such as read only memory (ROM), a hard disk drive (HDD) or a solid state drive (SSD). The memory 112 stores, amongst other things, an operating system 126 and may store software applications 128. The RAM 114 is used by the controller 106 for the temporary storage of data. The operating system 126 may contain code which, when executed by the controller 106 in conjunction with RAM 114, controls operation of each of the hardware components of the terminal.
  • The controller 106 may take any suitable form. For instance, it may be a microcontroller, plural microcontrollers, a processor, or plural processors.
  • The terminal 100 may be a mobile telephone or smartphone, a personal digital assistant (PDA), a portable media player (PMP), a portable computer or any other device capable of running software applications and providing audio outputs. In some embodiments, the terminal 100 may engage in cellular communications using the wireless communications module 122 and the antenna 124. The wireless communications module 122 may be configured to communicate via several protocols such as GSM, CDMA, UMTS, Bluetooth and IEEE 802.11 (Wi-Fi).
  • The display part 108 of the touch sensitive display 102 is for displaying images and text to users of the terminal and the tactile interface part 110 is for receiving touch inputs from users.
  • As well as storing the operating system 126 and software applications 128, the memory 112 may also store multimedia files such as music and video files. A wide variety of software applications 128 may be installed on the terminal including web browsers, radio and music players, games and utility applications. Some or all of the software applications stored on the terminal may provide audio outputs. The audio provided by the applications may be converted into sound by the speaker(s) 118 of the terminal or, if headphones or speakers have been connected to the headphone port 120, by the headphones or speakers connected to the headphone port 120.
  • In some embodiments the terminal 100 may also be associated with external software applications not stored on the terminal. These may be applications stored on a remote server device and may run partly or exclusively on the remote server device. These applications can be termed cloud-hosted applications. The terminal 100 may be in communication with the remote server device in order to utilise the software application stored there. This may include receiving audio outputs provided by the external software application.
  • In some embodiments, the hardware keys 104 are dedicated volume control keys or switches. The hardware keys may for example comprise two adjacent keys, a single rocker switch or a rotary dial. In some embodiments, the hardware keys 104 are located on the side of the terminal 100.
  • As will be appreciated from FIG. 1, the terminal 100 is a thin-edge or thin-border type device in the sense that the touch-sensitive display (hereafter “display”) 102 occupies almost all of the main, active, face of the terminal, leaving only a very narrow border of casing surrounding it. This being the case, the display 102 is easily touched by the user when picking the terminal 100 up, when retrieving it from a pocket or when it is gripped generally. This is indicated in the representative diagram shown in FIG. 3 which shows the terminal 100 gripped in a user's hand. If the entire display 102 is active all of the time, there is the likelihood of touch commands or gestures being inadvertently inputted to the operating system 126 or applications 128 running on the processor 106.
  • To counter this, and referring now to FIG. 4, the terminal provides a display controller 132 which is operable, under certain conditions, to reject touch-based inputs made in certain region(s) of the display 102. Specifically, inputs made in such region(s) are not transferred to the application or applications 128 running on the processor 106. Such region(s) may be termed “dead zone(s)” for ease of reference, although “inactive zone” is an alternative term.
  • The display controller 132 is implemented in software. For instance, the display controller may be implemented as one or more modules forming part of the operating system 126. Alternatively, it may be provided as a software application that is external to the operating system 126 but is executed alongside and operates in conjunction with the operating system so as to operate as though it were part of the operating system. Here, other software applications may call on the display controller 132 so as to cause its functions to be effected. Alternatively, the display controller 132 is provided as a module that forms part of one or more software applications. In this way, software applications that include the display controller module 132 benefit from its functions and the other software applications do not so benefit.
  • The display controller 132 operates in association with a set of so-called dead zone definitions 133. The dead zone definitions are stored within the terminal, for instance in the display controller 132. The dead zone definitions define one or more dead zones. They are defined in terms of screen co-ordinates or pixel addresses. The dead zone definitions also define conditions in which the one or more dead zones are active or inactive. When the dead zones are active, the display controller 132 or the terminal 100 can be said to be in a first mode, and when the dead zones are inactive the display controller 132 or the terminal 100 can be said to be in a second mode.
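  • The patent discloses no source code, so the following is purely an illustrative sketch of how the dead zone definitions 133 and the filtering behaviour of the display controller 132 might be modelled: rectangles in screen co-ordinates plus a flag distinguishing the first mode (zones applied) from the second mode (zones inactive). All names and types here (Zone, DeadZoneFilter, and so on) are assumptions, not terminology from the patent.

```kotlin
// Illustrative sketch only: the patent discloses no code, so every name
// and type here is an assumption.

/** One dead zone, defined by pixel co-ordinates as in the definitions 133. */
data class Zone(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int): Boolean = x in left..right && y in top..bottom
}

/** Display controller 132: "first mode" = zones applied, "second mode" = zones inactive. */
class DeadZoneFilter(private val zones: List<Zone>) {
    var applied = false // true while in the first mode

    /** Returns true if the touch at (x, y) should be forwarded to applications 128. */
    fun forward(x: Int, y: Int): Boolean =
        !applied || zones.none { it.contains(x, y) }
}
```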
  • In a first embodiment, the dead zone definitions 133 define that the or each dead zone is applied when the terminal 100 is woken from an idle state by an event triggered independently of user action, such as when a voice call is received over a wireless network, when a data message, e.g. an SMS message, is received over a wireless network, or when an internal timer function such as a calendar appointment is notified to a user by software. In the context of this disclosure, the terminal 100 is in an idle state when the entire display 102 is inactive due to the terminal being locked or in a sleep mode (which can be set by a user or occur automatically after a predetermined period of non-use).
  • Further, the dead zone definitions 133 define that the or each dead zone reverts to being an active part of the display 102, that is, touch inputs are applied by the processor 106 to software applications 128 in the normal manner, following user action in the remaining, or active, region(s) of the display.
  • Referring to FIGS. 5 and 6, alternative definitions for the dead and active zones are indicated overlaid on the display 102. Referring to FIGS. 5 a and 5 b, first and second dead zones 140 are defined at the lengthwise (lateral) perimeter edges of the display 102, opposite one another. The region 142 between the opposed dead zones 140 is the active zone: touch inputs and gestures made in this active zone are applied by the display controller 132 to software applications 128, whereas inputs made in the dead zones 140 are not. The dead zone definitions 133 define the pixel co-ordinates for the first and second dead zones 140.
  • Referring to FIGS. 6 a and 6 b, a single dead zone 150 is defined as a frame-like area adjacent the entire perimeter of the display 102 and surrounding an inner active zone 152.
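  • Continuing the hedged sketch above (and reusing the hypothetical Zone type), the two layouts of FIGS. 5 a/5 b and 6 a/6 b could be generated from the display dimensions. The 40-pixel band width is an arbitrary assumption; the patent gives no dimensions.

```kotlin
// Hypothetical zone geometry; `band` (the strip width in pixels) is an
// arbitrary assumption. Reuses the Zone sketch above.

/** FIGS. 5a/5b: first and second dead zones 140 along the lengthwise edges. */
fun lateralZones(width: Int, height: Int, band: Int = 40): List<Zone> = listOf(
    Zone(0, 0, band - 1, height - 1),             // left edge strip
    Zone(width - band, 0, width - 1, height - 1), // right edge strip
) // the region 142 between the strips is the active zone

/** FIGS. 6a/6b: a single frame-like dead zone 150 around the whole perimeter. */
fun frameZone(width: Int, height: Int, band: Int = 40): List<Zone> = listOf(
    Zone(0, 0, width - 1, band - 1),                        // top band
    Zone(0, height - band, width - 1, height - 1),          // bottom band
    Zone(0, band, band - 1, height - band - 1),             // left band
    Zone(width - band, band, width - 1, height - band - 1), // right band
) // the inner rectangle the bands surround is the active zone 152
```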
  • It will be appreciated that the dead zone(s) indicated in FIGS. 5 and 6 correspond to the likely location of a user's fingers or palm when the terminal 100 is gripped during normal use.
  • Referring now to FIG. 7, the main processing steps applied by the display controller 132 are shown. In an initial step 7.1, the terminal 100 is assumed to be in an idle state. In a second step 7.2, the display controller 132 detects that the terminal 100 is woken from its idle state by an event triggered independently of user action, e.g. a phone call or SMS message received over a wireless network. In a subsequent step 7.3, the display controller 132 applies the dead zone definitions 133 such that inputs received in the dead zones 140, 150 of the display 102 are not transferred to control applications 128 running on the terminal 100. Touch inputs received in the active zone 142, 152 are transferred, however. In step 7.4, the display controller 132, in response to detecting touch inputs received in the active zone 142, 152, subsequently de-applies the dead zones 140, 150 such that touch inputs are transferred to control applications 128 running on the terminal 100, i.e. step 7.5. Otherwise, the dead zones 140, 150 remain.
  • The dead zone definitions 133 may define that a predetermined number and/or sequence of touch inputs are required in the active zone 142, 152 to de-apply the dead zones 140, 150, i.e. enter step 7.5.
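  • A minimal sketch of the FIG. 7 sequence is given below, again reusing the hypothetical Zone type. The tapThreshold parameter models the optional predetermined number of active-zone touch inputs; the default of one, and every name here, are illustrative assumptions rather than the patent's implementation.

```kotlin
// Hypothetical controller for the first embodiment (FIG. 7). The step
// numbers in the comments refer to the flow described above.
class WakeEventController(
    private val zones: List<Zone>,
    private val tapThreshold: Int = 1, // assumption: one active-zone touch suffices
) {
    private var deadZonesApplied = false // step 7.1: idle, zones not yet applied
    private var activeZoneTouches = 0

    /** Step 7.2: terminal woken by an event independent of user action,
     *  e.g. an incoming call or SMS; step 7.3: apply the dead zones. */
    fun onWokenByExternalEvent() {
        deadZonesApplied = true
        activeZoneTouches = 0
    }

    /** Returns true if the touch is transferred to applications 128. */
    fun onTouch(x: Int, y: Int): Boolean {
        if (!deadZonesApplied) return true
        if (zones.any { it.contains(x, y) }) return false // step 7.3: rejected in dead zone
        if (++activeZoneTouches >= tapThreshold) {        // step 7.4: active-zone input seen
            deadZonesApplied = false                      // step 7.5: de-apply the dead zones
        }
        return true // active-zone touches are always transferred
    }
}
```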
  • In a second embodiment, the dead zone definitions 133 define that the or each dead zone is applied when a user-interface window associated with an application 128 running on the terminal 100 is enlarged, either by user action or automatically by software control.
  • For example, as shown in FIGS. 8 a and 8 b, when a user is operating the terminal 100 by touch inputs made to the display 102, they may enlarge, or, as in the case of FIG. 8 b, maximise an application window 170 on the terminal 100, usually by means of a one-touch input to a dedicated area 172 of that window. This causes the window 170 to enlarge automatically to occupy substantially the entire area of the display 102. In this situation, it is possible that unintentional inputs may be applied to the relevant application 128 due to the user holding the terminal 100 in the manner shown in FIG. 3. If the application 128 is an internet browser, for example, the user may accidentally activate a hyperlink after maximising the browser window. If the application 128 is a shopping list application, the user may accidentally select an item on the shopping list for editing or deletion. The application of one or more dead zone(s) 174 by the display controller 132 can reduce the chances of this occurring.
  • Subsequently, user interaction in the active zone 176 is detected by the display controller 132 and the dead-zone(s) 174 de-applied to make the entire display active to receive user inputs for controlling the application.
  • Referring now to FIG. 9, the main processing steps applied by the display controller 132 in this second embodiment are shown. In an initial step 9.1, the display controller 132 detects that an application is running and is presenting a window on the display 102. In a second step 9.2, the display controller 132 detects that a touch input to enlarge or maximise the window has been made. In a subsequent step 9.3, the display controller 132 applies the dead zone definitions 133 such that inputs received in the dead zones 140, 150 of the display 102 are not transferred to control the application 128 running on the terminal 100. Touch inputs received in the active zone 142, 152 are transferred, however. In step 9.4, the display controller 132, in response to detecting touch inputs received in the active zone 142, 152, subsequently de-applies the dead zones 140, 150 such that touch inputs are transferred to control the application 128 running on the terminal 100, i.e. step 9.5. Otherwise, the dead zones 140, 150 remain.
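  • As a final hedged sketch, the FIG. 9 sequence differs from FIG. 7 only in its trigger: the hypothetical controller below applies the perimeter zones on a maximise input and de-applies them on the next active-zone touch. It reuses the Zone and frameZone sketches above, and the display dimensions in the example are arbitrary assumptions.

```kotlin
// Hypothetical controller for the second embodiment (FIG. 9): zones are
// applied when the window is enlarged (steps 9.2-9.3) and de-applied on a
// subsequent active-zone touch (steps 9.4-9.5).
class MaximiseController(private val perimeterZones: List<Zone>) {
    private var deadZonesApplied = false

    /** Step 9.2: a touch input enlarging or maximising the window was detected. */
    fun onWindowEnlarged() {
        deadZonesApplied = true // step 9.3: apply the dead zone definitions 133
    }

    /** Returns true if the touch is transferred to the application 128. */
    fun onTouch(x: Int, y: Int): Boolean {
        if (!deadZonesApplied) return true
        if (perimeterZones.any { it.contains(x, y) }) return false // rejected
        deadZonesApplied = false // steps 9.4-9.5: active-zone touch de-applies the zones
        return true
    }
}

// Example wiring on an assumed 480x800-pixel display with the frame-like zone 150.
fun main() {
    val controller = MaximiseController(frameZone(480, 800))
    controller.onWindowEnlarged()
    println(controller.onTouch(10, 400))  // false: inside the perimeter dead zone
    println(controller.onTouch(240, 400)) // true: active zone; zones now de-applied
    println(controller.onTouch(10, 400))  // true: dead zones no longer applied
}
```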
  • It will be appreciated that the above described embodiments are purely illustrative and are not limiting on the scope of the invention. Other variations and modifications will be apparent to persons skilled in the art upon reading the present application.
  • Moreover, the disclosure of the present application should be understood to include any novel features or any novel combination of features either explicitly or implicitly disclosed herein or any generalization thereof and during the prosecution of the present application or of any application derived therefrom, new claims may be formulated to cover any such features and/or combination of such features.

Claims (25)

1. Apparatus comprising at least one processor and at least one memory having computer-readable code stored thereon which when executed controls the at least one processor:
to cause display of content generated by a software application associated with the processor;
to receive signals indicative of touch inputs on a touch-sensitive display;
to respond to detection of the software application being woken from an idle state by entering a first mode in which touch-based interactions in one or more predetermined sub-regions of the touch-sensitive display are rejected, and
whilst in the first mode, to respond to detection of a touch-based interaction outside the one or more predetermined sub-regions by exiting the first mode and thereafter to respond to touch-based interactions in said one or more predetermined sub-regions of the touch-sensitive display.
2. Apparatus according to claim 1, wherein the computer code when executed controls the processor to subsequently accept user interactions in the or each predetermined sub-region in response to receiving user interactions outside of said sub-region(s).
3. Apparatus according to claim 2, wherein the computer code when executed controls the processor to subsequently accept user interactions in the or each predetermined sub-region in response to receiving user interactions made through the touch-sensitive display outside of said sub-region(s).
4. Apparatus according to claim 1, wherein the apparatus is a communications device, and the computer code when executed controls the processor to be unresponsive to interactions in said sub-region(s) in response to said software application being woken from an idle state by a received telephone call.
5. Apparatus according to claim 1, wherein the apparatus is a communications device, and the computer code when executed controls the processor to be unresponsive to interactions in said sub-region(s) in response to said software application being woken from an idle state by means of a received data message.
6. Apparatus according to claim 1, wherein the computer code when executed controls the processor to be unresponsive to interactions in said sub-region(s) in response to said software application being woken from an idle state by means of an internal timer function.
7. Apparatus according to claim 1, wherein the or each said sub-region is at the perimeter of the touch-sensitive display.
8. Apparatus according to claim 7, wherein the sub-regions comprise first and second separate sub-regions located on opposite perimeters of the touch-sensitive display.
9. Apparatus according to claim 1, wherein the apparatus is a mobile communications terminal.
10. A method comprising:
causing display of content generated by a software application associated with a processor;
receiving signals indicative of touch inputs on a touch-sensitive display;
responding to detection of the software application being woken from an idle state by entering a first mode in which touch-based interactions in one or more predetermined sub-regions of the touch-sensitive display are rejected, and
whilst in the first mode, responding to detection of a touch-based interaction outside the one or more predetermined sub-regions by exiting the first mode and thereafter responding to touch-based interactions in said one or more predetermined sub-regions of the touch-sensitive display.
11. (canceled)
12. A non-transitory computer-readable storage medium having stored thereon computer-readable code, which, when executed by computing apparatus, causes the computing apparatus to perform a method comprising:
causing display of content generated by a software application associated with a processor;
receiving signals indicative of touch inputs on a touch-sensitive display;
responding to detection of the software application being woken from an idle state by entering a first mode in which touch-based interactions in one or more predetermined sub-regions of the touch-sensitive display are rejected, and
whilst in the first mode, responding to detection of a touch-based interaction outside the one or more predetermined sub-regions by exiting the first mode and thereafter responding to touch-based interactions in said one or more predetermined sub-regions of the touch-sensitive display.
13. (canceled)
14. Apparatus comprising at least one processor and at least one memory having computer-readable code stored thereon which when executed controls the at least one processor:
to cause display of content generated by a software application associated with the processor;
to receive signals indicative of touch inputs on a touch-sensitive display; and
to respond to detection of a content window provided by the software application being enlarged by entering a first mode in which touch-based interactions in one or more predetermined sub-regions at a perimeter of the touch-sensitive display are rejected.
15. Apparatus according to claim 14, wherein the computer code when executed controls the processor to reject interactions in said one or more perimeter sub-regions in response to a user interface window associated with the application being enlarged in accordance with a maximise command.
16. Apparatus according to claim 14, wherein the software application is configured to control the processor to enlarge the window in response to one or more received touch inputs.
17. Apparatus according to claim 14, wherein the computer code when executed controls the processor to be responsive to interactions in said one or more perimeter sub-regions in response to a predetermined event.
18. Apparatus according to claim 17, wherein the computer code when executed controls the processor to be responsive to interactions in said perimeter sub-region(s) in response to user interactions made outside of said one or more perimeter sub-regions, and optionally wherein the computer code when executed controls the processor such that interactions are subsequently accepted in response to user interactions made through the touch-sensitive display outside of said one or more perimeter sub-regions.
19. (canceled)
20. Apparatus according to claim 14, wherein the display is elongate having opposed lengthwise edges and widthways edges, and the one or more perimeter sub-regions comprise first and second separate sub-regions located on lengthwise, opposite perimeters of the touch-sensitive display.
21. Apparatus according to claim 14, wherein the apparatus is a mobile communications terminal.
22. A method comprising:
causing display of content generated by a software application associated with a processor;
receiving signals indicative of touch inputs on a touch-sensitive display; and
responding to detection of a content window provided by the software application being enlarged by entering a first mode in which touch-based interactions in one or more predetermined sub-regions at a perimeter of the touch-sensitive display are rejected.
23. (canceled)
24. A non-transitory computer-readable storage medium having stored thereon computer-readable code, which, when executed by computing apparatus, causes the computing apparatus to perform a method comprising:
causing display of content generated by a software application associated with a processor;
receiving signals indicative of touch inputs on a touch-sensitive display; and
responding to detection of a content window provided by the software application being enlarged by entering a first mode in which touch-based interactions in one or more predetermined sub-regions at a perimeter of the touch-sensitive display are rejected.
25. (canceled)
US13/179,124 2011-07-08 2011-07-08 Controlling responsiveness to user inputs on a touch-sensitive display Abandoned US20130009915A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US13/179,124 US20130009915A1 (en) 2011-07-08 2011-07-08 Controlling responsiveness to user inputs on a touch-sensitive display
PCT/IB2012/053436 WO2013008151A1 (en) 2011-07-08 2012-07-05 Controlling responsiveness to user inputs on a touch-sensitive display
US14/050,992 US8717327B2 (en) 2011-07-08 2013-10-10 Controlling responsiveness to user inputs on a touch-sensitive display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US13/179,124 US20130009915A1 (en) 2011-07-08 2011-07-08 Controlling responsiveness to user inputs on a touch-sensitive display

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/050,992 Division US8717327B2 (en) 2011-07-08 2013-10-10 Controlling responsiveness to user inputs on a touch-sensitive display

Publications (1)

Publication Number Publication Date
US20130009915A1 true US20130009915A1 (en) 2013-01-10

Family

Family ID: 47438367

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/179,124 Abandoned US20130009915A1 (en) 2011-07-08 2011-07-08 Controlling responsiveness to user inputs on a touch-sensitive display
US14/050,992 Active US8717327B2 (en) 2011-07-08 2013-10-10 Controlling responsiveness to user inputs on a touch-sensitive display

Family Applications After (1)

Application Number Title Priority Date Filing Date
US14/050,992 Active US8717327B2 (en) 2011-07-08 2013-10-10 Controlling responsiveness to user inputs on a touch-sensitive display

Country Status (2)

Country Link
US (2) US20130009915A1 (en)
WO (1) WO2013008151A1 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130069886A1 (en) * 2011-09-16 2013-03-21 Wan-Qiu Wang Edge grip detection method of a touch panel and a device using the same
US20140053111A1 (en) * 2012-08-14 2014-02-20 Christopher V. Beckman System for Managing Computer Interface Input and Output
US20140123016A1 (en) * 2012-10-29 2014-05-01 Adobe Systems Incorporated Enhancement of touch user experiences
JP2014174800A (en) * 2013-03-11 2014-09-22 Ricoh Co Ltd Display system, display device, and program
WO2014172454A1 (en) * 2013-04-16 2014-10-23 Cirque Corporation Graduated palm rejection to improve touch sensor performance
WO2014171606A1 (en) * 2013-04-19 2014-10-23 Lg Electronics Inc. Device for controlling mobile terminal and method of controlling the mobile terminal
EP2804088A3 (en) * 2013-05-16 2015-03-11 Samsung Electronics Co., Ltd Mobile terminal and control method thereof
US20150363086A1 (en) * 2013-02-19 2015-12-17 Nec Corporation Information processing terminal, screen control method, and screen control program
US20160266785A1 (en) * 2015-03-09 2016-09-15 Scannx, Inc. Method To Automatically And Selectively Mask End User Controls In HTML Rendered Content
EP3037927A4 (en) * 2013-08-19 2017-04-12 Sony Corporation Information processing apparatus and information processing method
EP2778886A3 (en) * 2013-03-14 2017-04-12 Samsung Electronics Co., Ltd. Mobile device of executing action in display unchecking mode and method of controlling the same
EP3093746A4 (en) * 2014-01-07 2017-08-09 Huizhou TCL Mobile Communication Co., Ltd. Mobile terminal and mobile terminal menu item setting method and device

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD738370S1 (en) * 2013-07-04 2015-09-08 Lg Electronics Inc. Tablet PC
TWD161239S (en) * 2013-07-04 2014-06-21 Lg電子股份有限公司 Tablet pc
KR102223277B1 (en) * 2014-01-06 2021-03-05 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
USD746281S1 (en) * 2014-02-03 2015-12-29 Lg Electronics Inc. Tablet computer
US9626020B2 (en) 2014-09-12 2017-04-18 Microsoft Corporation Handedness detection from touch input
US9804707B2 (en) 2014-09-12 2017-10-31 Microsoft Technology Licensing, Llc Inactive region for touch surface based on contextual information
US9898126B2 (en) 2015-03-31 2018-02-20 Toshiba Global Commerce Solutions Holdings Corporation User defined active zones for touch screen displays on hand held device
US10057999B2 (en) 2016-05-24 2018-08-21 Microsoft Technology Licensing, Llc Electronic device having a reduced dead border
JP6784115B2 (en) * 2016-09-23 2020-11-11 コニカミノルタ株式会社 Ultrasound diagnostic equipment and programs

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7847789B2 (en) * 2004-11-23 2010-12-07 Microsoft Corporation Reducing accidental touch-sensitive device activation
US20070026371A1 (en) * 2005-08-01 2007-02-01 Beryl Wood Personal electronic text library system patent
US8059100B2 (en) * 2005-11-17 2011-11-15 Lg Electronics Inc. Method for allocating/arranging keys on touch-screen, and mobile terminal for use of the same
US7657849B2 (en) * 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8018440B2 (en) 2005-12-30 2011-09-13 Microsoft Corporation Unintentional touch rejection
KR100803607B1 (en) 2006-10-19 2008-02-15 삼성전자주식회사 Touch sensor unit and method for controlling sensitivity of the same
US7884807B2 (en) * 2007-05-15 2011-02-08 Synaptics Incorporated Proximity sensor and method for indicating a display orientation change
US8154523B2 (en) * 2007-12-13 2012-04-10 Eastman Kodak Company Electronic device, display and touch-sensitive user interface
US20090174679A1 (en) * 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
US8004501B2 (en) * 2008-01-21 2011-08-23 Sony Computer Entertainment America Llc Hand-held device with touchscreen and digital tactile pixels
US8259080B2 (en) * 2008-03-31 2012-09-04 Dell Products, Lp Information handling system display device and methods thereof
KR101537683B1 (en) * 2008-09-02 2015-07-20 엘지전자 주식회사 Portable terminal
US20100107067A1 (en) * 2008-10-27 2010-04-29 Nokia Corporation Input on touch based user interfaces
US20100295799A1 (en) * 2009-05-21 2010-11-25 Sony Computer Entertainment America Inc. Touch screen disambiguation based on prior ancillary touch input
US20100302212A1 (en) * 2009-06-02 2010-12-02 Microsoft Corporation Touch personalization for a display device
US20110069021A1 (en) * 2009-06-12 2011-03-24 Hill Jared C Reducing false touchpad data by ignoring input when area gesture does not behave as predicted
WO2012129670A1 (en) * 2011-03-31 2012-10-04 Smart Technologies Ulc Manipulating graphical objects in a multi-touch interactive system
US20130234929A1 (en) * 2012-03-07 2013-09-12 Evernote Corporation Adapting mobile user interface to unfavorable usage conditions
US9098192B2 (en) * 2012-05-11 2015-08-04 Perceptive Pixel, Inc. Overscan display device and method of using the same

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030034185A1 (en) * 2001-08-13 2003-02-20 Nokia Mobile Phones Ltd. Method for preventing unintended touch pad input due to accidental touching
US6813344B1 (en) * 2001-08-29 2004-11-02 Palm Source, Inc. Method and system for providing information for identifying callers based on a partial number
US7047005B2 (en) * 2001-12-18 2006-05-16 Motorola, Inc. Method and mobile station for enabling a preferred slot cycle
US20040085351A1 (en) * 2002-09-20 2004-05-06 Nokia Corporation Method of deactivating device lock state, and electronic device
US20060082556A1 (en) * 2004-05-18 2006-04-20 Interlink Electronics, Inc. Annular potentiometric touch sensor
US20060087993A1 (en) * 2004-10-27 2006-04-27 Sengupta Uttam K Methods and apparatus for providing a communication proxy system
US20120297212A1 (en) * 2006-01-11 2012-11-22 Microsoft Corporation Network event notification and delivery
US20100067723A1 (en) * 2007-04-10 2010-03-18 Oticon A/S User interface for a communications device
US20090262078A1 (en) * 2008-04-21 2009-10-22 David Pizzi Cellular phone with special sensor functions
US20090307519A1 (en) * 2008-06-04 2009-12-10 Edward Craig Hyatt Power saving scheduler for timed events
US20090305732A1 (en) * 2008-06-06 2009-12-10 Chris Marcellino Managing notification service connections and displaying icon badges
US20110307727A1 (en) * 2009-02-20 2011-12-15 Wei Wu Computer with Built-in Wireless Module and Standby and Activate Method Thereof
US20100295559A1 (en) * 2009-05-22 2010-11-25 Freescale Semiconductor, Inc. Device with proximity detection capability
US20110081889A1 (en) * 2009-10-02 2011-04-07 Research In Motion Limited Method of interacting with electronic devices in a locked state and handheld electronic device configured to permit interaction when in a locked state
US20110159844A1 (en) * 2009-12-28 2011-06-30 Nokia Corporation Method and apparatus for user interaction while device is locked
US20120218282A1 (en) * 2011-02-25 2012-08-30 Research In Motion Limited Display Brightness Adjustment
US20120235790A1 (en) * 2011-03-16 2012-09-20 Apple Inc. Locking and unlocking a mobile device using facial recognition
US20120249461A1 (en) * 2011-04-01 2012-10-04 Analog Devices, Inc. Dedicated user interface controller for feedback responses
US20120315929A1 (en) * 2011-06-08 2012-12-13 Oshinsky Stephen Systems and methods for communicating with a paging network operations center through wireless cellular devices

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130069886A1 (en) * 2011-09-16 2013-03-21 Wan-Qiu Wang Edge grip detection method of a touch panel and a device using the same
US8963859B2 (en) * 2011-09-16 2015-02-24 Tpk Touch Solutions (Xiamen) Inc. Edge grip detection method of a touch panel and a device using the same
US20140053111A1 (en) * 2012-08-14 2014-02-20 Christopher V. Beckman System for Managing Computer Interface Input and Output
US9032335B2 (en) * 2012-08-14 2015-05-12 Christopher V. Beckman User interface techniques reducing the impact of movements
US20140123016A1 (en) * 2012-10-29 2014-05-01 Adobe Systems Incorporated Enhancement of touch user experiences
US9262523B2 (en) * 2012-10-29 2016-02-16 Adobe Systems Incorporated Enhancement of touch user experiences
US20150363086A1 (en) * 2013-02-19 2015-12-17 Nec Corporation Information processing terminal, screen control method, and screen control program
JP2014174800A (en) * 2013-03-11 2014-09-22 Ricoh Co Ltd Display system, display device, and program
EP2778886A3 (en) * 2013-03-14 2017-04-12 Samsung Electronics Co., Ltd. Mobile device of executing action in display unchecking mode and method of controlling the same
WO2014172454A1 (en) * 2013-04-16 2014-10-23 Cirque Corporation Graduated palm rejection to improve touch sensor performance
WO2014171606A1 (en) * 2013-04-19 2014-10-23 Lg Electronics Inc. Device for controlling mobile terminal and method of controlling the mobile terminal
US10078365B2 (en) 2013-04-19 2018-09-18 Lg Electronics Inc. Device for controlling mobile terminal and method of controlling the mobile terminal
EP2804088A3 (en) * 2013-05-16 2015-03-11 Samsung Electronics Co., Ltd Mobile terminal and control method thereof
US9529471B2 (en) 2013-05-16 2016-12-27 Samsung Electronics Co., Ltd. Mobile terminal and control method thereof
EP3037927A4 (en) * 2013-08-19 2017-04-12 Sony Corporation Information processing apparatus and information processing method
US10007382B2 (en) 2013-08-19 2018-06-26 Sony Corporation Information processing apparatus and information processing method
EP3093746A4 (en) * 2014-01-07 2017-08-09 Huizhou TCL Mobile Communication Co., Ltd. Mobile terminal and mobile terminal menu item setting method and device
US20160266785A1 (en) * 2015-03-09 2016-09-15 Scannx, Inc. Method To Automatically And Selectively Mask End User Controls In HTML Rendered Content

Also Published As

Publication number Publication date
US20140035873A1 (en) 2014-02-06
WO2013008151A1 (en) 2013-01-17
US8717327B2 (en) 2014-05-06

Similar Documents

Publication Publication Date Title
US8717327B2 (en) Controlling responsiveness to user inputs on a touch-sensitive display
US10917515B2 (en) Method for switching applications in split screen mode, computer device and computer-readable storage medium
CN108701001B (en) Method for displaying graphical user interface and electronic equipment
EP2555497B1 (en) Controlling responsiveness to user inputs
KR101761190B1 (en) Method and apparatus for providing user interface in portable terminal
US9213467B2 (en) Interaction method and interaction device
US9557806B2 (en) Power save mode in electronic apparatus
US20110193805A1 (en) Screen control method and apparatus for mobile terminal having multiple touch screens
WO2017032123A1 (en) Video playback control method and device
US20140189584A1 (en) Method for switching applications in user interface and electronic apparatus using the same
US9173086B2 (en) Method and apparatus for preventing screen off during automatic response system service in electronic device
US11812323B2 (en) Method and apparatus for triggering terminal behavior based on environmental and terminal status parameters
CN109923507A (en) Multiple free windows are managed in notification bar drop-down menu
US10592099B2 (en) Device and method of controlling the device
KR20180120768A (en) Man-machine interaction methods, devices and graphical user interfaces
US8195123B2 (en) Call origination method for full-touch screen portable terminal
US20190306298A1 (en) Method and apparatus for preventing screen off during automatic response system service in electronic device
EP3220261A1 (en) Multiple display device and method of operating the same
JP2018507491A (en) Method and apparatus for processing new messages associated with an application
US10311254B2 (en) Electronic apparatus and information access control method thereof
US20110205174A1 (en) Method and apparatus for collecting touch event of terminal
WO2022052470A1 (en) Method for operating widget, terminal and storage medium
EP3528103B1 (en) Screen locking method, terminal and screen locking device
US20140033231A1 (en) Electronic device with a function of alerting running applications and method thereof
CN112187973B (en) Terminal equipment and method for processing incoming call

Legal Events

Date Code Title Description
AS Assignment

Owner name: NOKIA CORPORATION, FINLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HERING, JEAN-MARC;REEL/FRAME:026944/0493

Effective date: 20110815

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE